[Binary tar archive; the gzip-compressed payload is not reproducible as text. Recoverable member listing:]

var/home/core/zuul-output/                      (directory, mode 0755, owner core:core)
var/home/core/zuul-output/logs/                 (directory, mode 0755, owner core:core)
var/home/core/zuul-output/logs/kubelet.log.gz   (gzip-compressed kubelet log)
meGTةӨt^V;?Ѿ~@廃EnퟟŪ2쁳y}y Fa2:y E)2DLHڅ* &S>A,h5T"=&b+5OT}8o GN'J[Ϊ]MKOu=>Nc4nnv5e9/q؎oniaZ%0''>߭e0[3Y>k$,'s7fwO:UAfbbI43=)^3@V{CͰMx'wEc5zLyyگk:.Ǔ[z?l8h {/w]tŀ27<^Hff+n, HI%Kk-9^$m}›"XK"J-ZD)kgKuX(/*^jΕT%:9b'!@Ϗ;iB04"2ip*F!VZ5  4q0;c/xI#u5 g.9L8P`peqy.~( 9n-7gss '@ C9\uIW&Y (ךY +ZeKtS^`nT?p,vW`R\2X6.@ېyϗîjLi/$QԺtRU1gγ4,q⧇0%;pna*zä0ŇIiqr->+8o<WNnxZy9'pW>=B]4MKȉM W]^ ~a"](L C\m-:t#Z&SD <}s f@#a`Op`Nzc q`ϕCZA=#`A MM(nw;zB4:͒t<>iZc<%I%1$q @ sP)c$pP,'Ɯ6V;E!2aJ24C4l+e>i-x,rK5`&!Oe4qި$ePBJFMTt CӸTc :Y\Tti2_NqiU6MuFl4vU)F9ec ,a^&KGT 'ڐ<1ĠRy rVym;\$S#AX˶o8{1sBkuh+g/ŁƨxuWznp^Ra>ߍgNK8|Uߕ}OcRM>_JJ-3/ePƀNDhFeC: u1֕v* =fue[n:C*}g&jXJV4`Hi#헟^oU0ZPs`@=jԿWC˃WxTII~:qX\}࣬89>ߝSɨdQ+J_᏷˒߇ILjc-In_֓?ԿMMyNkV%ݥN4v+ v_{F+f#f47/" :8y,6=t5%{ȉIP';a=QpqzTB y<y"hai)<1Δ <)SpmWgO#=U6JP]F <XC#w&@h)Q34sV ٪f6k 8eUOuV&-ʾ/'yDzpXNwMmd+|ƍ 6򺋗OJ4k%:?{ȍ0/I6aɞ ]VF-%YjْsEɪbUU_FNaMHfcl.}K:-QTE4;kwl%rHctQlDh|dLG |-(C XN&ye`Qbhnj"Z>o-G RC^$5a3m5V*-xZb6SJnh܅oz4mqJ7.0]K$!ϗynoJĊJÒ N&A%dZJ' .A+LZaJ4`3b~ِ a4j8cL2`҄0n<vGz#ڀ&D= 3\c^z:C!>Vb /3?{B\ :燊ezw 6Q{D|jK {"32x5 ؀Fr]ltm)]CgwI&xaœ'x!3tmF&x)z yIat$LEt!"'U:`@˾! \H☁Q )f`:$ rZBXG iCž:d ɘVOtmPG!1zT4(%LR#1s) kvIZ'iEMO"{bW֝`w7^A)je1OJyr6/tܚ6Y0s29ʍO9Zg$EZzm<u Xk nBÛ/V֮z \rQJmƌ6>fYc2kc=/6`Z%M U"[ =ad;fU^.Ra7&h$lvǖ 3Qn28?7t8! 
;;`jc@(kܕoVx{RVzU@4Jg6!ʣYl8 b&z#ul5;+Cy2QԍQ6gEi D <\ & ÔR!q+A7B.HwD@LL*"Q* J3 \-`@ /76")g fCS^q|v;>Q5c+t>Lk;߂.W `c4$:ZR90y{4^z?_'?>n.Sb$,bo͟aoz1@,1C'{m4w#ic7uf pQqK)࡚mnp5J^A64V!^)'T2:>M\jnaJ=9g7*}Ԩ@ko/~ߟ?^tW]\}#: ̊F$rtZ?Z6G]N66SMfo|oާ??ih~QLiV)kkA| 8O{Uʜ/SEk@9Uh 's Xu+#>Z5QG͖̩lc:yIV)~1ac~t0md *0o{Nj]c~b7FWl}NŔie%f['MxӄS\ DCfRPJ휺T9EcyJWQ ZASJ)(QAai pA HBr5 ۸o=70O< -" ;_ͧ˅)㷷$Tvף#7V8Xtt NaAaOmU(Rq 9 M$7iQ3fͬ$C!tȰ^ KG{=qMmJS]/ 1JaxL!Iazc`5۪q2&`kG E-XZbE&aRE \"IQkl61L!x6oȸFp }sre&h[/sg&ؙ>d` z^"VZuhuy_!zER }0Or~m >[рmX04,$/h;mF^g#/況1 鴁r1FA4((jҕ.|R%T8 ySI KI 1Y,-,$2kgk>|5+}~>o >\9/!?.sۗ_rkdzFZyQuN{1$x-F("ڠ"d̾r4UL+n"tू0<ƾ1I7fg5wb|~̶qe``PHDef@c3/ &-%Q~]]٭($dyhK?7뤍<ָ \@y3VԴ;q88i!6Wvz+:mV-oGXh~c&홮?IRKrJ`N&?+O&?Qű' .+LgZKMֳ](uYYa}s54PרbQn3fϹ(_U9%hhxFj'Pt,Say^ Kb|ߛ~S0YyF٨uKMCއ}E<,ְ祃 (.? Ƙcl2ĽWԊuܻ>̫Y H(򙖜eN)bxЄ;rDzbR$-gXڄ6zޙq?pߵĬ8<1g/^~ywFҾ MB7~P lCwT=㳋=?q/NF53ktE>h4O@jM;!n+wsvޚ}6I|&A9R+K<.b}ͅp2f`f Et"%dl,q sg޿Lم;`9NnYqNJ Gs1sdN]\a`KEpHk G-pM5մ5Xe ObL}P䳹haրT+%Mp `* 8Ly*R$ lK\k]W_CՄu׊XÄ0Ƶ"v8! JY6x| )NfC_Sۚ ~TZHN$ZFaӐa#QD"{ςAa,R\*^ R tbkk:p%xbtn7s 26!B|j!\7uF$&};r 9a:6.|~G(韟^EE첷 mOS4a@'!L9k)(ԝ4tJ, tV>UÀ>95iatH\)`BJ\ytcq9_F'@<,zc+r`f?T?QFm^FBS &ntY\/@'wW&W Sޣ1o5D1Z<)nvS~pQ7Ξ#aI|k}S-ՋՆ"^_ܘ|ra!m=6uC:|Aa8u`Ŵj>޵5q뿂ҋ*4QI]䜊)Śh4ӳ;H.Iޟl{8wed]gv׶g5T16k3y&#ϣ:"x3Fz9c}I{&;ˋ*eQ";~__{w_?^|epOo.=:_Lǃe+ $Qp?խ~[#qkú|s_[{W|mZ8g7ϕ7gS𜔾85Nvza+H6|RdI]RinTA QYrgE めSC0:G)Ca=76tI$Q>1LONxzagGZfʇAG.5'ߵn=8XÞ͇`>G>!!eH{^!eH{^!eHso@2g RM'$WJu.=Z%pB*5 ᄘ/6n'/VkE,T%R9\YMrn, *g.˯{+o74X8[[eeDxӐ{y@A3P6 Gfn  qxI)PG~Ⴉ+ș,ZB 6UUq;"~WzYKD'՜,"7EnʄeӁdzE. 
5.A2`<`e˺H!w4 ETTN5Be^tw ܓNI˭$,-E(WƑUx/$MnpGsr|} Edy6xǚIg puc*b&g-4j7{KUh4ewOJvѿV{ߛJaj꾯Vb_Ȳ&BS/d15s{9Le< oB _ zSB;cMp"?/~z~틟sM/oǏ҇+MukTRx@&˝D%p=`l6qM.)x ^CCٰ݊-fZ0\/mqUo|CQ)-K Y ع*CTa ?SBD*ι526*<B?cP)TOHzHMw>SΟrnms| ^M4 {Qr~t3bo0л ^[yBdV7ւè^}n~6[ng뻙gO-Ϫ{}0>TyΜwtЮOozp1-ۯ007* G+o IEU0W*[q'" '|kr}rQv±h>fRjey<]3%S(MFH7*:D̥9WtYCRV<1İ$OR ‹jr}5"Q-wJ,Jp*<2ZYjx)ըX|#Vc-˨} !eid'@+9&ӷx>%^K~vGO'dw㿳vÃLA)"BJ%<봻0nCӯgc$wۊwoQt޼|j>F7q$LP)|>%E%1fTv\o@ Qs LjMƷ&CK24)rA!At,+.)r:B@֛DCI6@7:))d*W頺$鐒n{&|CjB̐q, Xy )cΣ G9 t6MuFӢrU3F5<%c -a^%KY mH\Dk<BV{F`Tdd,#^*{޲Fh; mewgPqh\Q !5?LaRC ~54JK XCc!Jv<YJ(h:H^HS}1dU"US9Ld 6k_M+u]ԺysNQ - Dϥh*DFL!LDH<0*ɍABN馁]{u*s)Ϥ9|^_fXBfZTc۶\ _,#Y -kWgS(9c0@*g iR*8-qPQZ`!5.xdi *gVKSggʠG4Vc1*p@M ʑ \N̎%щ"69k5!Lu=n~ 纄%0WQ}ǹHBF[ۼ{,8޴E rt2 E|ǎ-'c%y߃y@ʼnko;;BrL3 %)z uHƱ`#$AQ-RXHwq*>Hמҵ`dj3I͗{Dǟ~ j7Tv0I Ivm`;8z+pJL 00Q|LkBESIXJ[@RS*z5 S+xg- !~`]H286jk;N"cEƞ(-ƜcҔ?$+3i|&ٙ2&u%ǒ$ߏzX%% ~H.,E֏RV$9–TGwVQ8E:b[hZѴiOi{aO-VnŦ(`҅5 <]eB?` =0r%[=@j9ʁRiv;j-ApN \o-pBZz \oє[D \o-pZ-p/!{]/ÂY/ e,_ ٣B$Xr7*\؛BL0^TB4[z \o-p[z \o-p[J2VzK^zP[@y (o-w?@yR1U=̕.eKw:b,ȫ + p* 0\ցbe p.:Vt M+V4 0\-6,`DjD(YQ&@?[ 2$ÄPFyN3TsFf'?ױm3{̏Scc:.V H:<G2Kf ^yzerMPf$A(!@ASTNE^(a:p4 n0w.BeLȔŠ4SUR oHnF%Y0d-=qTE<8:xrme͜(x\PcY/Qk'3-@P#ؘѨY+_Jw T9YDI0,9h4JC)/ !@4&J"Ql whHQv1m=ù6&KgXw_3;.0&h.;0ja}h.8 hGL"JR{` }ee[9{0ߣnoLi_hyژeq`2 2P@ޜ"=vpNNr`xC\b&?uF Ol̥Byt\NRoȐ?n6h{'YboɧT~r}߆*[uSw9~m}5k|f%:::ξTsb pM $1N"~D[q37T#9>Uss_].یg];7o_x3&F#nt[nnx;Qr؀9)s Y5_=#F T{fBf.fKuM=sܨ=QY?dF]5WCD]3 gDG o}9+UyGЪ?JCZM%YsS2fwOQwo._|ǔ7shuFQ W`6&6!пCk8dh{ mX3>6bFf:nvԶ6 ˯{mg0OԸ7ٟtwӕ^#H6$3&HUnC6ʟf C<X/.gROl6F&Ϧ}N*JinIq⌾Z?& vǫc?'p gmEx u\xe/+7/}w?_/6sNQXLyE=D0$%O0Xa2@uϽ;2YFտ~$Pa  1 4i xB.+$iAmnOQ$od0W;XM~wqmҮIfh#\g>@K;M#86)I5 F`Hͬk52 @N%eL I$xѹ-;sN" %3Q=$v OSޘ1AY$T&ZRKv\XED`̎71ܿ#Yn&UOurE[5b*Eggbx od?;&D0R bB/Å& &SƜ`XGhikdt;n}D<2N[s'< 4_z^z$"`̬eEo<ٖ& 9\7?a~WxwkT jrϳ]9f.̢E1 uzue*JJD#iHzav)Զ掻@0DdmXN/} |8,.r [e[B5@x$u=`B}1Ial _uxhz3&Oyw̵Qlۚ&x<`sNҌDl쨒pṩ-(?5(Sޖ6NZ}zXpA& & i,wF'sw%&`0M@SiM@0%LdQpncME-M;Vm9^Q)#BKɠ- B@[-oz}}y6֟ݕB=)DUKWZ Fj+U^eeٛ *Ǻ;?핣d 
TkՎFx:E DK V\b /|GS79F١IY4ƱxG[rW#L{Ul_P|2Z:RIYAy(>Ycg#:.j`E[,ӦƠ"H#« UͯE84 7śКʊ#J8;pĈӏ7McOzh 3Ʊ֔Nm_a#e9k$!9nS @v>U&{il MrkR OGhWժ̕PLKiiƋ]vCcJ]l Bu}{ˁ542YV=ѳ׿(Zɴߚp~=,w-~tOU3dڹ-b D^;ƥ$M`Q l/C #&4UYt~/nw2nMD*hrBqVMj,5VϝskM#qYϓwڡw6F£Y6:>:21-=tڌ{QȨEb n9e%goZqO3qs$"}ߜIyAm”w7;I5 %;Ą: GkՆR WaTDp~IHN*i'|sA&-O18Q&+1O"sޞ]vq=m-ʶtKǺ׽qoά,m dɐ%\h swJIbMH퀹Z9! QG 0JkAm0.( ^!? j,0VnB)%$AhcI.Y!Yb$ Ba'H'f,9bRȵZ3#8DBF%W >h4xKoBPpg ]KP]QSM-uhrp 7Ŝ> YfY1$Pm$G$n~Waln"@CC 2cԒl_!␔8ImXf5=U.DxÛ2"íU֜Ğ[ d+eDT0n<5  rˠFn&+ B TVD-XKbZ,=|KVZZ,Q%jD-XKbZ,Q%jD-wY";(`&jeKbSZP.9KZAB a7{8ⰚNxp>PPF+ *"DJОpf"l&Bə DKf㿳f,WD 8*%fk%6!װ$J)އ8WB1kW%+(s8a(2BZI=n#Z(ɂ6F{g{wq8u=ѵ~~' u UkRfzDݍy|]lDD3~` @%S}JH;@ܼJQ 8AZR3 @m1n16^8Ë9QȤ04ChVZgG+SPP j…֛DCI'M72z+#$AHH͚&.x& ufwa,X y1#Nx8<4ӐM`D :#iAbʝ#N; -a^%KYNKqDABF>4B(܂l Bc'Dcqxۦw콘+ދ&i+{/6H eK5=,͠n>4MtB$/u݂UP(PU"US9Ld >Eo%#C͙*4Q! BDNJx"In JA;]DbDzLZ^.>m"{ 1ݶN"I ֆg7G *e8mReUL@ WRJÉ8(jqR= iJ^ Nu)O!UhcT21MHY"!H%Jj[#BTίр8o!?>I ~r(׆ۇ"X6r`4{kg]Ä`FxSוj̑cvSxzN n@nm4_bmMnm6OgG)x@+]j2zіIc6nn GVSl}Md'0ݬ =>.8nXWL< Lk:Z|sԜ=l设jyk&ZL^#d꜒ͭQTp<*.u0^Tmnnna|#vwbvs~7px}4:z}7 <ӫrXfZ 8aM2w0 l <irNɧSz~wDbJic!I&BUJE #9I*XK(8հor=M*x @ElĞ&4xORڞ6Z\N+wk'ҐBHΑS\!zchFZ[JH5޷q¿P׎Եez&7wF~ZjOT|uGj5O!¾ pg/p^ Df$6 QD<6x^@9e%z%(`aJ G@b x"^YD IQ&`m Vh1)!  
1GLH.8(Gq 9 k01J+V(퐔vvh1, gU^-lTF{6;3K -WPQ*vÊZh[ 0{h]lhD!v49؁"^K \p#AN2Ί7Qx9LĘVS&YfSi@7hrS ePVJBXA刋ٮN %oMvD.<ϲx{s"CyD${o?SIaC6efR`Q)k?Ly$L_['THI[Q%E%,~ xNsfԓo t_]ǟ~>&)Oqr9eꈣV5>0bQn?w^+|祤Tݜ>lx-K3;<|U9Hn2:HHD4Q9DJ"A-iD)Dܻ"#C]-1lW3P4!T&PԐ+G-BT\f X _㤴8J',M;0YVi<2Ԙg+tI i+X QؘZ!QпT?g1CVQJ Bј,ؒHyO!%䍵Ƙgmz3ps-X\(`L-.v]&js٦-"q|}1aDNcPebZC:&ٝB ߞY'SgEQv Q)p.n5()v47p1#=SM׹@JޠƜ))q=ܐ뫌E?S>wׁ 1tl֖/skkF2%gM9?^doyCZq]g۟on/4ěWrJ7M˰e7.,Cj;w1~aѣ57ӢqUF6:{ȦUP|Qh%_C!Jl3<<"&U;۟gD?׏W?/?O?L_}󇫏pۍ A^>ғ6fK#~am>twYW >% ڏwIqY΢Mb C=Bvس,ϑD.3drgp>LnVSvJJ&;6JԖΐ˵W]9XG1 T6tUOɞJ!#43$*&(8*EZeZ߶8u6ijuY j3g+-斋=|z׈}.-R1$3ݫ,8OepLBL(I,+LRZ\ d!VwlEo4u@C%![" 3x-#̒;oo9Ms QsJpm,"DK*`"'h3j^hJ{|v.&XN^\_gXY_&'󗐠]p9t.("IlI&dȇyPJpC68.W0\0r7Ey:剶(ZD=C^m1'^ٸQ_9 rs z{sqDUm?EYen ݜ$r.B i0ѿ?VN[?[MZ8b4SsMDx.:BA](s e,y<\Ѭ 6 +h&j\^>Pzӫc'ۀz֑ڶuāydKm<.t)F5Dž˝[1Nn@VGƐYҤagapGGР 2cŽq0z8?2= n)?3 Ҵ,YanpL؍x3樔Mp>{aBx`=ųq]Qi@!b$rV (oFሷ%I w@]\) ,xL;s+9 YFŢI)1ᔎ` DHpV3)Yj 7^Tdi9m"K^ߢtO+Bz]B7Ås3-i}ȕV>xT>xVT>xW k=3(Gzp/!FM: TqVي L2ڒt| }@p@n_"X/40)AYcΣzI 'Jܾ#Nw@k\rjCRZjmp|侶noC7ͺ1cehG҈ g_yB%_Iy=N-HBX+!\ԶrJHeNY>KuVH%PsBJ'G,$&GAiQ"ZxDݪkM'ޕ%Eȧya?fƤ 좱x,)'3}::)[_ IQbWY4a#BO^$sQKʶ Z#g)}jF"$.|-'c|;cFG0шoH>D~Af`$ä5qm #p5RGTPJm|,҂dFF #GtهKP(Ho=O s]CCǀ/H`*7UI3/z6tz|!;.yg>7ɥ&?^}n~ ۂ7a%ɱE(c._Vv]\6#WBZ Z˯aWSeZ*3B\zeFR\+3Ҹ{98i6i6i6 IN1I&hf]9FI<%?O81qP'P_kQ=A & 3}O TU8SdvM1)^sR;S;yLJTX5%O&=D;U+h'S*xIGn98>HYo]ӹ+/ݿKG?ᨼLJ>1~mRu5=ͫn2t[^s&V5+~{;g]XⰨ(݇þ=L߸Q6 |m; ger_T+n(S4%^heZݲTȖ,JlX-Qa)fJa~4Ƹ䖫QǗyݡS];x; XLn,x|ꥥ[@›8%nYfnl,5>]mn~WӴwv+~u#be <&Tډ ]NcqvhZ!trQ;Pe/AyA!Yi;3|YM m;Lw z,g+440ďY׀'Lu{=3QL޸4~oY:>Dc7<_g^VHKk+gυ}B~r75?\qK-k#f^kn0YLYwrZ^\}VwFL8 DO49MAiK/=y."YlwXl/<[M2{8mIp@Q mCz$kEPK& P^2Jr/wK2!"s+t=C^J2z*QU6W\µB _Q.,r1gr۩7(YTOϛUA[D;O>;F߃.0JAD&ŗlWd/۲҉myNԪlߑVT`Ӥ+2*_{gl{-9mɧFU};l3#WIsD8Ѱ_֞KA~n*֘" K軤upKw8M%X>m47*EtO;kH)EВ)E 0O[!^HxA/8PvZի*EǸ.q2\sYzO#\|y LqY,CjfNc$5<^|9Ǫwb"X_O] TLbvP1QIr=BE1W\\%jś+b{4WZcxue|@P:ӯ/ٴ&h i_nq<%&;J҄zSJk, 2`M곍bNy- 3=R ` I?C`8gt9a@UiGWazJ6E3511*7 /㇟Yz,!\)חeWMTޣS}M z֠jPj5j$W GscJ_҂#|=D.b\^JTJ՛1WdWOnEM Νqa 
W,NXP֯rOy'ܜ@j܁V]߰xzVLw)0}c(VW+Cu!aWZ8|HA28+πjyS` W"R^H*q-._HTE5|k|0){M\SWQA)3ZXd"y8ܽTGOg1r}~Q9/r*(HoQ J3=D.Qk.9ՙB].z}X79YeLySY~wV36B zt)6tzWPuH䦂t4 |A[_}n~_ZhN_]얆WK8=6z:w+x6ѪMv'fWN6==㋌w_{Qk#S=دUێ5G4~ۦғb`-F0R>x ,(0F < 6@_G8aՈ~#sy}_krl{-ZC`uX$\ QӈqRI2IDrY(Pʍ"VFudXDc䰅T))$4¼W&g\nW?1,HUC+ !+jՁeMRU)]}3ܨy/DܷLYkrD`<ꪥh;3xfVnRwK7"O i 1_ bbF?zՏQ}+e$iWHNV~8ld ӏ93"g9߸h?,J3.|%\H)ոuHP+ilL)'R?PyR86FES9-Ӓt(jtT(`203(s#^*3NB0io5-I rH# be}0z)#"b1h#2&"rZ5rv%Mk\1 x#IIܟ)]>lmL6qh~iP]3y/ϭe=?TQPOB M\ro8y 3kPTNc!3Z`&PfTetaD`p'55rvkF8UdžŹկeU!(c-qE,_+0:MxCM On۸rPAp5(4 mX|AYXaM[l9`x> #7*D؀zg-N[ɕ#9ѥXJG!j x$ )ib'@6>D佖CMFShB5r\el[ӟ~;Vsv\=J Ƴlxˊ坋Z=vqփ᤼eVVY:=7[IV[14)~t/j|d_S@T |tuQ]xMlW,oWpe1okw^\Pk;׮.i+Nfggϼrox/T_q@/=hv o ׼NoNIUYrT-rˉIR,Bg͍BF+Dy}{g5C[\0OI B.UiB=!{w>wn8Lh mg 2Q7M{wB& 3`B-XʁM5k`58R̓+r8[VD[G-yk4|oz rߝBT "HFRPaSnT-9AN j gXRDg' qfAo$*dZ"\)٭>CNH(6*:b+j'Z Om/;xxY ʻ7PPHqcuJQBRUII*"bիw; [DoTc18;/:Ԟ[18x-K3[O*5@n<5shP `A+ 0Ly*R$ .yPWk1HwD@LǙTD2'Tfi%SƀBD` #_c]0 u^d&֭zU)b cj*(e޴a#LHڌvH#ѣAп/A|l%MeT"ڲzY:C@lEjA.CR|H50c;W~=ׂBdvI>5#rfSU&0 I?ƉUazc\YLUӫ1#iN N_|':;;]ߜ.~TR5c+ZwFm @c4$q 2]OU sX,]ÓYY?zvv|nQ9{Wz`l?GYUݥwF?0V+1J]ːe$\fY>!0W< N0bK{5׃ٺ)ֹ*AWm7ٵj^5WkHZcŸGyҤTN^O6Տ 7to?8?O??=OxsL;[:|v ?|wlt]$+Ұ^4KkśoʹuA;psZ_n?u_6JfR >ˌ|ۯ@uhK+ U/8](!vSuSB_1H.3.WuN ۩&M̰:mo{x]cПAF=`sIM鬟iJ}FŔie%f)mJG؂342+Y+,`u/isYcy)ƫ(sX@Р)QgT#jCXe6\@RPY.av{s,IUpbyLIU7%|UԷ~G:`0v1N{ ,¹;gu%aNZ2li%% ,}4]zV-g,Q_X%!FI0 җ@cm^+3a"fU@_t 6ȹ [T!((9lV9\"If(8U.tL}nEnflQtM )KGfVU>E,G?-&_9>DhT .J --%L߂WRK+AŔ `!` wȐ1 xTWE$ZI1xLKp3"J}KI N 1Y,-,$5-˭eg枛OOt_7,㋋$/VzVd% %ei2z[Lӏ@y5C>V0B0#`G"Ujʈ)FQ4`pH0uV+Ujg8->*jN^GȻnx|u`xYO7=%%I'UvlMon zܮ Lli͍!J}i^KeesŲ)NLX=Q0QD˂^9NzHͭ6fa`:'IqAt¬rV\={n@5y #HJ,u-Z8AWk jͳðtÜcn&=t j;7;Ocv̘b55SCeB8nx2Uu"vk_j_Qu$zsA;P-j8x39N2FTѓX6r໫RhinM?vG]ͷvK cw:\Lz`ϓ3=/gvQzA9QeC @m!{UPBx'mK9}& >^*bsxyQ?\|,-Ƶ[6Ίw z7fV ~!Sۻꁤ!C^bR"Q`tF\0o?cR(]%\9*c@ Hұ4š2D٦Xk/}s,͡=;s,7*(zg-N[ɕ#9qF5EfLGU*Ca kIDk1hDk45[̞llYTuPljAP6TX(:cھP[L!""7LFnO=UsH7:a)AWޘnURw1sP4GgLD.64oթ{盈;}$*@@bDi#D(򈰈iр-9AN ێ+0$A}bD <* s~]`FʩR Bevfg3IT6ɪKkO}oUlʕ77*@y((28Dt ¦CQRJj(/gJF( | 
.#H%`Ffd j0 !.7gfθ^K9/̆;\,Ppmwdۘ~p} Ob^c[cQTAn1S)/9`Z,1fR0tVM !D%$ItY$R:"*3fm#`\ ΆGµw`U PZ☁ Jf`:$$t>c}9w4&ꁳUzil%y\ad`zHLd9DXҠ00$f.%栱뛞zJ)m)lD1^U\Ȱu1+}؁$G%;f,c5)cV:/̈́n\5<YUɄA BRJ5U8=$]&;L.);o(4֌rehvJ0WE$8 &Sd:7 # +6^Ԗi ERcib ̞llq7pྫྷoysT3@W{߅' zx uePEy|5y|mnAKJE!Us& ɽx.FAP(TRA pe$\.%6!`BL[c*5颔Rk Y/5^#2J6p {r< Qfq:bu#D)Big<8aK1jnJe_PlYhj-A RYU4) Xb: Q25 DyqA3]UQNqis`R,2Q| IaʨNiq$ 24vD46 .]mw:7B;]Vjqy3ƜLr߸LBG`X>o~S3vyΌRU^b>QwbH2pyJ"^ͼ*_@qiîWt0jpsj h-jU2l$ q ~?V?T\E}yu<ڡM*iQJwfѴ&3XxOwmI e#}Y;5 ,Hl9gI8HkXYikzzw=E ΪỾR{^VYoxZ(ڜ-rm|ܔ&l`2~U]zÒ0x=rM<>pT<=*i)XL . WFKJNVC:f3h덨UlڑnF `Pi"h^}eLR%#2E8rg>zY^T1kP }@m0jt|R{<^p;U=s([?D)/ i~QC2KnN͑ͭ$G~Z;2 4,o6JndwgQP+N"0( &R1 Hr kc;PԋI9DŽRn&x92JYȢ`;X#J?B"ՁY-O-' iW0ȫŖ4Cz_\d}򲕭G7Mi l[s}^ܫMg[GETls"gRbr[7N&NٕDv9A`ʃ6FH%'Znld(" xtX[$Z![E|A'RiǖBtnUlQ1SLc#_ڑo|w> +H0!"4bOS ^KcPDk7[V wR=M]БarȊ$o vb'a7 ZwRZx' 4MfdrDzB|7jQX=?lt#@\rF"-#6f^z{}zp?$NwIYӸv'+TSz[ٽC>gmO?&^Ip{Ř^FcvHH!IM-a$EV -:#ir̕-2ly2B\ [XsJ:曼i_} ]M#7{)7[ =@nodV@3'![.@랧[SD F[LQI&M]e79G2lsfgH:ybjbB3 ^zav)]EԃLjnsˮ&pۼ>5Y:t\jxḮM8/۫e-X}>+1^>=zð*d1{\Z7ͷ-TX@ը콢`_OxoLMT@=Ƕ]2W_`ީU0։:z5YW,73(Jk@x: āѤZy tW@0pT*8%H6 S <:R")$Hd æ2*`d 8uSAP$@2:i=TK-C'!=ZkƸvPCĎ'PA)˕ #f#VFa'a{n߰MeT"{$[dZRFǤk{Siy%vȺv DŽYF\?aP RL1ED ~)`AB $!V10p.Ӡ$ k/S ~et=TyoGES6L~ aXf PC_s)(~?\qJ8u^0ΊN)%* H ˥CRJ\ rgE0s~(t~ƫk+̭eͮMHU=U}_^Wrb nQ0=hhyLHݹR0)zjtVOaFVySi8.t#ޭv\[v"Py7LstƋSТ?բii$]8$|A-t`ZdɻhwsZeY Z>!ͺY5gFH+ ݆}_1K|nҗ K嗵:̵/~~ǟ7g? 
u> `a8^Hޘ6"@nBynkjo45SkRSW μG^0}#3жضDO/qGzE'%f )d:> W% W+ 9KNܕ/Tqp*C4t4M<oP>%w*ƘaNI2#gti ~(1Cd *0ꑇ:H)ʝDž_^(bʴ3o Q`$`18M8̥VlVVjYH7EGF9/D>ƫ(sE!4hqT ՈF٠ !0ȀM$^W'u^dy^a}6lOM[q,VFF>oVՑ3)8R(NBUjt>셜+[xS土bg5$l6J!۠LnL )&YVia@\0PTJnLhSBGbss[Ti1F: lp6HF7g)O}^s%*[wZ9kHE|> ,h;n=OݨG86;_ 4OD>wT5;KyVڥ"IE-nϧ7nU QY⟆+oL0RһpnF0>@/Rp}?}oQFT*4i eQ,͒@rVػ)r~韅3t^T\( B2o4L;{$:-[HE/ˋQ LA*mw\/ӧE8)+euC{x&캸  {~|_٫o԰," Ά-N3;IMྭTP9?ƻقֿT",3vnE5j"ma;)&H2@0` r"WC1ȉZ- A~Y"N&W1))/d?|_~|yWMrJEd#fHaeg{7+]V}n{ :毉zHLک}c$N&q"dl;Q1ߩF=zguG5Eut,*({n P8/J8/Q+վɯ=pLdϝ(,[5t6c cʩXlћbsAޕWѺ$G6Ы2/U~b!=/&x [ZH~H:`ƕrbJ$j wW"QIIJ]8᫽YGH(򹖜N)bxЄ8GK Q(Mi˳(m٨e3e}`, LiR=KTW]eiuoUe ΔJ 9ќ'h[[l]H)3DޞF: {Ĵ(;ȹ kA x:?ˆ^FcD2Y+냉KM1aH0<)c"5Gj#g=k\~~Ny`O{]DT'jDVMwSF|QYuzlZARKyKs$NwpxYӸv'{1-/-+gmO?Fz7v;5t)+RT*vyl"u>햝GUpuDUDG tN]Cwxz7yӮ7. ̄.+}]ܵG5@>dg`|Ȳq-ddK%ʹYdU㛎oNoĥTL$А+gfg+gRX\rT@uj=+7ydA;[ˇF=_o_+ln}k kNޭlKxs;<l3pfǘc1W,C[Ȍ13œcgkyycjY/ݽJl;y+Fԫjsm' " %d\CϛUҌ;)wJRnIu|; l8o8Hq I>S)Ը LtX qohTG:nn[\~p#-(Lχp< >}yϡ[g˘wbjԓe릉=ㆎfæWf(t׮tf{h_]nx^(gE]p[qR7 a~Ǽ XPH;捑s𳈒ERbऎ `+mLra:i{ PXrAո u{k Yԝ(bXPm:yYu\!9WhR)Bϡ]fЦ ̠W@%к󓢱Vu%o>,~ZZΛ|IȤEv9 DBSTN"EZtJ-hD-AAs` k/T~p_DW꬘ܓ?)%x4SUR oBG!J.2(c'AkYPAqn/YR9 2Pj(PcY/tZ;7m>Cdǡﶦښ^H "ÒҞZI}H21O!@4&J"Ql whHHeiQw"bBG>ޭ+?kBcRAqsAZbњ!C㫊)"JޟePfwS=.ٝՕBC?YG'bLfw}YpA5ٗ #WJrNuoSM~kI\Y4r\HtByt\ORGĮW*q&Rq>;?C ,\~;,~7k^#޽{ M/U< Mr} Y1mǽ8J5+ҘO&ٸ9csE91 _<4_]O斛]pR|?nAzmw@H[O'{i놵wxk72bFF Y;p8MipZ{edsA6VC%c^3 \>}+r^$BLF%Ш5뤱prw׸O>o??}ÇOOӇ]O8b$I^}x]kYko5tmXQO'%Cn9&31h~6 ? 
><M:ȟvI>^}F6ĆuOh*U'7!u69~P46Fi39:[V۔z4 (1SF<& |?1ԟQGE Zsaiu֏l}_K6`S$IWgH&r + SF hΨ{#L>W qO"Pa  1 4i xB.+$1Ltz1r+GRC8iI/U>#hHXFPҳ6jMS}hp0g\Z}+$G*Eqflo)J bo7Sp+GJ%FDv R,ģSTVT6c$+ijc!/W)Y[kVbe.W6{z}CL r=>P+'x$,9Tc";޾h[KGޮ7- [5KZ$׿]7HP2`l23RKְSG*%e]7ϘY*Pj;Bsg-_q{0DPD*ιYRuBgiϸސYzufڅ4.:d9wkM'6DB-ME ?ygLD'-hυȬE 5Kh:˼X:>C 4[mVYC:{c|.(Cl0q)]ɻuyx0ldסq7/סqyZ]7oPuk64yo[lÅv2|v Q/ zR8uwC vA*8^Y+mOD$mRzb^NzN3v;R>ǴbB(iy.=ȌL} 9Cf| y;dƴ@Vj<LP>W%Y(ךgXt_Hy8  㛀aU],< wѲv,@ `믑a*On=8~2s}kIbŌpi]9]E}*23:xƉ3H~9(G1%T|U\D"TR3$'^(8V+SbPY1QL"i͟BS!(t" EDHRlgx5jڸQW(;;<:ejs" ǣr:D-Z|}k7~--e鶞-KRy{#GpܷJZ 3I z&-eArX6YAJFU !9Gt>C8e(z^ q,؈L$R*ad,FvɸYK9,,2vs9R`xr al?>~4R~'Tv0}W.u$r.m7f.,/ICegT.{#<NCl`;8bK%8 A1TBshF&J.QÂqy*R8=mڝۆ043 Ť &R$I}d:&mQyDiA-d )yT(kB$^ȴ8 օ$hTGeKp]6F"]%"-,i';XBXM l85I*ʃpeTG#wgC }K]uvvvoַU?xӈ՝6 NDHW.ye.&B~-YMH_o3 T鞍Aw#zzр_p|*y'@7Hۻ[ `8alɌn8 ^\&нϱU>ண*&@PYi%pH_)̐>zrypByP(chCCbH 4aZ8B&@FwuM$wu )qEZQ_pHQġDJ Dkz+ɓJ("44"'oC VP 76( qP&zNTI10uXWr\shq?0Sjΰme׫%u7S58?U )|>%E%1D)hp\o3A:R3 @m1|k;mp69QȤ04CW:#g*H-Մ 7ȝ9O08o *W KBtʑQN(~PMEh:-\XdH#T'FcN3ݛu{N3 dX 6IPvGU}0-㯜ǂy,e mHb@h\p ^[z7b ]DÕl^{3#٦mewgPq>TKCxu~ORQvM{;̨ ++N6>6=/0vܻs}\ i\HQ^)P|9˪J>V8~5i sƚZ!2>] 7yëvv/nx4hrGM@ :x+(6;2E'HE'r<2GYn+[`8"ΎlF<>ۨH4iDFL!"ODHuT ߧl7?&O9t6fyՠf=!p䴺timyo1V֘yn;B7v?9kabf7 nZlً5KWfGsӡ,T+(AE],]t.ݵKw]t.**brGLEg\FDg3v=:(WU fGg7<\ҴO{EAX1#e%֕SU,ĥ*_ޖFo3 &iiDH$EJ"x.$eM !3b NL0ǃRk/PL$bTm L%ʌbVh5T<&b:vFΚ8+놈]?lմXXXC9>Td2f.W#.(PIčJ ^%Ɯxׁt2T&5 WHo5E0S1TɓA;#g5/Գ^nj+ܟ}UbbmI0ݮ(gr[, 8h-pyXA 8hZA `a-6aղҀ{ۇ7??s?~zU_x-}|~:0e8ߣrLb{SN/4;TRUi^a9 HBQNSKL))2찀vX a;,`찀a݈`kvS}wn1ًɾ*?Ir6p(S)+;p<0E` gbk&Olˡ{| %[y<вK %Wq<_k? pؿ|HbJic!I&BUJE!9I*XK(8p7}SWA f "e4e( 0.H)"w 쌜,s+4^um.leE9gOrNn%Eim+U{7DO -f;p/}(;.W)qdHIDOL9z,˅2J,n[ɨ4s$g+C┡^{-!$X0 1(T $%cgQ^TKyYx,dcYh,ܪ,$M\Ӌ>6 ͟ `02/_v֑ȹQKP$;PApBEd(,Zb@ MknCl`;8b+J8%0 GIL(J>&Aw,;#gva3.wEjwv:vDj7]46- ) aĤ \&>hk2+ßm0IӒ Q3h# x"^2 ?.$G:i3rVڨ%`xDKDZ$b]+)! 
1gE,\p5Dh2Ps]w^`^ NyNkStG=[OTɞ'L3m5(S@ٯXaVTIQ 4}* xN3LTp;oyLQ3wV~ 5hEj4s.srp s|;_<;S5MֹA9]'br pCv)Ry;lg{os[ fpgNk%?^TqbLj[F=Px.81Uj{ݛ nFv}ͯ.O+g|7|fi?>-77{;(£6#}b[AHJ}%!tֶ k_x22bQ-p.|4|nfYriѺ*#W>d۪mjbl8GGF>O}5Fz#~Hho4 CܓcyRELLc2Y8'x>wG_}tD>zG~}_p.h$IF&[V]-M҆u͆[ֽ>xm0`gwk@q0|'mFAu'nnMYzzaOld `ܛ6T.U]*jQx"qn"rXj#&ϦijlYF)(o4 iq]u~H1ԟQG h@҃a?:/$)ura*rEI:gL&>)p +L8#S\լ};^M :Pi  1 4i -.+y<: BIwv&'=Q{|nznzCX)ʘmw ,*/=2Zz !x= :Ne{+;@1W II FU MJX*gH? S@7ik(w|,fO=K;&C* "86)#b*+ F/4[<))Cznl $I 1@)O5Zks3P@xb]tEj[%|8K>x5Ɠyj1]uVŗ`ԞR3D蹶w]>^Oq]4GJRH]Dr*FmRS+Aɔ ʣVV"0"sAZ)%ŢL<ʘʌe2*2mNw#%jDʇ &(d@{!Q'DGYMKVPV:#g2bt[{,\uK׊u59YizX 3|{ϞZ| ë,8OepL“ {}$JC+2)m1,Ŕw, q.fӍ4=B>wG|/;g y ~[ yO[`Ybcк/1C2^5{ƫ-0Bt4J꡴b!Z2t!z#l.99M-P]NXg8#K5(pU: I>Q\QIM|PcRfa  eщ(~SG$ |j5gPmE׀Y]Bް] vޠtS<~8x)>}+'/99Ѷ1:˓K`#̉q~'ԧyʞ}Rj/3 xx<# HY+zrD#cEtZԙtX fe $r@s "|vI u0G'7ev:V.Z.^|SJm Y+/?SGvc|X4Ϯl7=ٳi[ڷ}]?˵x8}֗;_KEsۑ.M+vH OJalɬD*S ./RD+: u&$b@.Kq{}Q#}ć/g?:swhA(#q拝\:CK&2dC6\fc̚ϧCg{9yYrh3,*]wk޶O/ncZԳ 1x7_ nFqr Kgnׇ--Ƅ:n_1;ixx@oKW=ݎC58u໺\(`PNr;Ԋr6OTJ Rcg)cM'Ybҍi܂0`DlA|AasאlԻ9rΏ^ijAb0 k[{l;n~qq)`xt GmVep'|> |Ab,%rlP!g1  {=v7AIpI:625B*L&݀<,Dֻi[D~P_/ɯ;_ggŮ+Nl$Pv( HĬ\T>dɠӮ(R%Z3:$< x,Z*v=+,` Ia$6b":tF֞8=($f 1Ӛ0£q:(=I#=}g'tq4}]c=vI.Q)ٛѱsb%kU"gb&CoPJ蔔Vmjٙ =CQ& d*"oJ;ݎq)n=ڇ"^D=qr!P"@dFC5z$~bGF[i8CP΢&`_$S|v87bbQdN blNEu}P({D9x#n܌9`Pؤ9YKXev2Vi6X'k  T[,Cq 5QCr$h9s& MV)0ϭ qLK_T_b]㬗.` bauy }iS}k| w?52N;vQx9ZXr[g~]p,5ӣkƵ?d0ڂ%m !ޙV.YrQkg0^EI5iRE YPEB2AsVfIxBo#fP<>3w`{յ 2e:k}޴adqx7oŖm.{DTyA,:ϥ8AQDi79 :!ڊp)k?Ir(OC6)U|-9$eD !ȂA`L3q} K:J/9P$N4O^8k,eW ] K 6K`|ICnB7E ~ƲAL"Z)K_bmsWPy}Dy+%a,#Ԙ֛G0~Iy]M3YbWHa߈oD֝ 8* V+o*ܰ WtؾÓ>RL (m(Ψ"dT@!i!@4CHhh^^^]bj[)]S3{fuW'L@|y|7kF(ËOHhrET4$= fk7/gtgē|}8o%g`s6/TV՞oz.}4%_6`7a7u d@3 S~px{UИRM¸YJr7)tvv9|2 tY.OlXQ|Qlmb˫wj1ԩtPR[i l4۶qզ'̩/'/V<[v;oB̾~6Vqs$+vN>hˍ~=cy{Ddo::6s]B&9?"!nle0>3Wc2S?È(>ot<֝d]h סr쬓rv(2QdXq86*4~78R)RA(oK)HkڒSJ)$WyX o쾩 g/;tۆMUQ2Eu3D;=ԝ˾v8VbFi?N2e;vfdi@AG"ɛB)\$ )d -2(]d/MP (F$Ʉ搥LJ[!! 
h1e{BKkVӍ4=B~(÷E0t=۳vKt3ɶӼ><4/Ơu5k_bx4#QʱU\0ɽڈ@l줳3Bt͈&1[G{uJ1N'gǧ T@N*ʤ!U/ҶL"h5aNH;4`/z>p|  e(;O@*"#G ˊQ@.96:J(QJ( 4)v8RBӀW`2|C"Y=9mqmz!{q%?iS#sjcGW)RFclPȬbvy>Y?h {2*@ Y?Tz}҃d"քJW$YF'3Wd' @.*VL) ;R[C:dY3\YW0%?mz ~6ߥ_r0c?"}Ԣ@ZS+z}˜Q{bOre^`覞Tar3Z\m_^\GSx>b-I83_dDwB}_i-u'F_&}Q?}S6Y%NBE$KOo `E\|r1-QU|)_F$czV.o.N.o~Toue+m#I`<"01/0򔸦H5IY!("imX)2lt=r1]SEۘ-6N_YqJGHR[ڜwf3#Z1S}tUpvMj0'jmW EՆ\LNp4|5=I6? '5=F4?FǬnĈ1 G,XNpmГgu󩂭bMOmJS'ţhss VJDk-x=ud!}[`EBdU!Jxit,0:Tx@ ՊZ[+6mՍq[WԁZ-=V7&]B][iɯ0t%Qֻ8/Q՝׽w+!j1ӻztvk$aE/8ͿWѼ%\p QY{LS5h:rAKǔB-CR:<ˎ)-.WiqL*7 qNG3 FߔA w^T,ӿ^v]j[ݶmJz--:*b\Rȭ518enDTBu)uf*CV sk YxЊBn;5raoJCjiSZ;y)/M~&8;r8kCPHC5R>`Q# }]gngPs<a4L@(ɐ&jjs pBhExd>ܚ YN\1\2N\WCo~N͑b_ywU߿y!m3 (QbJ GGoS?BA*ēinv21'*Z@IK.: Xcuh`=(щ%ڗtHuegNK@|o Gi'ph,qdzI6*f$!dEuL72Xo j ܄PGOu4Un|dxy7u_˳DñO4CGBݑCF t5Yޛ҃Hom%yCw&] eR̲Dp~:M+j>ʃXxzQ9b4 {St"Tǹݛ$M~n~~. 3_v U^]n[[}~ "_+1AvZ9hۇmi)&>BGBWR8Be,[X\a8̊̐A1=?@㽋*zY9\%B 3&i)MZ~ KC+MՊ4V6F Du U`t1/̜gKi`Ud:Z@=h5]O;1>$k;(~5.IUǶdz: QWXOvm}UL:4шTڴ_R̅S:b4n|@?M%XE,1L)Ee:~( ,[l6,XYNa@AB@L$41D@1i )@xi !eɍu̻ʥĩp 32q!f\fe9 xw>^0Q! 
k]+t=$]LOgl}tO.[C>u9r>:EQie%1C̤I1:0D%Jglv7YEOZ]Lq6=&LNk-%1U̵s?߉>FTfyhh :EQ+:4K%5;"`ģr8_+Qk;t_kRvkUO;/·@@j:>dFHU0F8LV}70pVq,"nwf Bl'"^,Ķ(4uE_WKq d?>~_VSS%=ݑ4fcHV`2 IO!x[1Ly>уAP:@BBŤtgcup23ŖȵAF=7l%V a(0SyxfUDp}AE8I5B3lxi,>^'IDs w15MN#9 HBYb|:tJ.%Yf6K"y_hu(Y,zZ_q&tNm0 'T_FϿ`4Wd \s=d*+1K$g)Уi=_7t~mc"ג6/VCQK"\0^sv뇧NtyA߯_z@;n|u{k䫛|+w%Ky ՜7!mf}AWiM{V Wf'ܚ/nqtOuL2c.1Lfҡ)Y:ts+$QI!X$r5KI֘CO")TZ%<$PjgS=v0nR$0>O 0<+Fگky_c^sYazfI,9pJpp>f,Rkljk/k$1x(D-KDIN"܀1  DBFGʚeë@ȼI&3U$9TΘ@q*16+e r3>z&I%ŵL&Ҥ9Z۲Dl%iWI.Nqdlml+erQtrv;x(q(pˊ9{a-&´EuOm86m(R_[UVBw@ҹ}{^zxo^Qdz$RFe ǽT)})$f=ܡLs.GmJ8yyu>H>NW ڈՀ;b7,Q1&ꤼ"KPL0<JАv{%PoUYVIh]z6BCYH Rk5X84aBG 8+nf ={ѝl2y$׍nJ棎!x$,pm0їc_-vS7=9*ɆJMO/w|wx/Ӈ?!O?}'}\IZ$s'.yZ5476Y7 quS^3}Qikgk@CwOED)WOInRzfP>ii}x[}vWs IAU"S[J F*+IAϓHI Ê=S̓hiC4HxԐs8 @ƫHrkN5(@Vun,3jH]Ou42QǸָ<i%N uyHd-(ٟVPw|\R˛!G๐or=*wZ./^zM@H/*]+JIj*+"@պ rQIQq -^TDH@$q_jΕT%Թ#{r$m#K1 4xUXZ5  4q0;c$:'-uڵ Z#PmQn˚x_79rjfɞxxiH/TtX #G|9C&!Sɡ;25:\?2's}nI?K0̷!2V&s*FJRel ?2CVVnR=PyC g'ivćQVE>/x۶Hqd:ߙ}_P;t~eW?}7?5h֮۠v ?/LݾF6_ F{ۢ.wmQ _F1dAB |! |`%X|H-4}J*QG$`шL^];L.EZW5t.Z񸐿w1V` hF"*0" uFր96eo#~@m۫'J~p/s~\$ n)׿c\wa58sױ__UӽRiaYib@41nX+!$HPMF^zļ>ܛB>9E98Z6,F:jl3jR~̏޽)VBS9L`ˡ;e༼6sMPzFV'2J@fVDDd!!2LA D䪣ZA_!D8#W`s<= ~sr1.W#QR#WHbǃ2E\ej 3zJqaԽ _=G om9ׅ^\mNT6LwEѧ 3wݟ} 1 gaT: 0_k1H9/^N,^R:E ęMT8(<9+ǒ1pI"]Lunkhe-R X**-%R9`R,DNC%n4*QhEAOĢw"m Di b `wpc}өrp6& ݛ',.p$kdUJVI*Z%YDҶIr%RJVI*Z%Y$kdU lbhb%+d٣)RŦGGᔔŘZJƥq)a\J6P`k-vI( 8OR᠋[H)V>+N%4fwdN5GiT/4!pTH% ɱddj%uRLטi ȅ&+mi֟2Wޚɗ~, ~ŸZTI+fȝuv1xBݯƔИ $M$H/ x^M"(M ,FkBTn[o*x1q]6 &eF2 ,Z#ՁHO}wPE#g}G͗$0{qdgGk=͙o:n-滭z0.XwA*i8 !AH'nV!HP6 ;+j"98 Xj#jk!aLhR /EJؖ95rsJh05kG_%mCd6m+! 
aTq cRwU(^שP284`'ҾBk݉VBE5F)D;T`c (w[2 %˨R8&Ċ$([@r> l0#F [v'2&cB6*I"*SN )Y]MbmNBw㛱o&zCe{u;|YG"FM,Aqlq^ApBE%`(,]Jզ2deFs,6T^3A%6`d1t m]bA?,"["]캨- I„bRORK\T mI_vH~0Y'{6q Q-DZR_pkHjJ56LK3GU׻ 1,@+JTy#@ 9C*P4Қ$(#(||L2B5hW&\ 'W0#d¸/ؖ"+%7A-"QƣCa|U,xƀ Vt©JecSzdM@ %i&PT\ݬ0E,zp=Y >:m"/LYO{\AFePƁy!2Aw'cupHۜ 3#IDn{ /zȷ; ax].ା#j3YQ̽;ޏ/ նcFN'LÎwQgbD!'߳n٭ܡ ..`Ɯ(Sڅ#J(" sHX0lt@0uQ9m^Xā` 4rzb, 2Z:8Gu> 7x\d;̐ 4YsN8zV{|w!owF 6jΈ'LPeT22{RBQq!GFў x&HQ\!. ZS(0<]w#s (0i5B䣶4EKbi8gAzLL 6܇( a~jkV'%r$zSү DfqBAՒ. jO,"$ Im|&yô˜{YlKAtI"]GhQͮaprTr1ONHhc <@hH;\x)Sc{cS`׍k8z!vj\ Ձ/V*D˸ٳ$ˍn~h+#)ix he+"U"[ym&&ھ_ַ̔˹b@2Dd2DEȡ(" NDH<a U[4+>rOWEt;|}yZ:[%sl7s?op zi7^²E+pƦ)*H)<\1ϘSHgq!sGR} ҨhsZҔBvs,">>M7_C׏w~^ź\knnͻ.筹r%hI8ZZ:zjf )xuռq9dz5.ڬv/ լ;Ėl#;6vyx}M zryIOk߮soz6ٹ̒cƬW$G~tSݶ*mӡj?}q֮%^6!\Y2yi|}hVQYYpȷV^0_iiw$=>KꏶdtZWFhr| 5{T[F38~0-.٭T"amgb/RJY I߂lkTjJCꂦ,Lml.aZ]6R·W~U# ZM7{7TUQhH$Hveڈr J)bS) HNPX | wsحɻ|5C>,l(:2jan6`ŤW_UmY5c#f/w$C;GgF1U'|b8қ^%|QWjiyؚ7m[en_4>='*wGSb؝[k,6,%jvX“\7Mken.㙶`e Pp 5kUm㶁3l7kRfcxq7kx)*ݸ3%ږjVCӰ\Ffaw[HqR@7 rKFr1L{k?KP lJ r R0틆089&<\\KbgK/[m{ZfsI!;w]_59G}<]L :W=)DuUXe$#NrB4^Yي}+XÃp0"z'ߨOY$< H 85LĘQ&ebmrFjr-rg<#jD=2:#((UY8O+go.Q,ej/[f|n+(n^-lgK>{tr} Cd_kؖ[=K|U`"id&J-xдD +D:0'D 1r Z L, ݊ 6q ֹ@Іit\9(o 9ˌZF8Ԃ?3DxOIFM5;"NiTH40Rx;=9X"ǿ/To˒/ muTb:(cȖD> T1[fy&`~5 )-d\p'N1kDb0O! 2CQFٹ Y3k}K=%ĿX=lj4#Lf 8"x'k9fpG $g0UbOV ŧlU^@Zdje("_d)UE7ZSwx ׃sWfpeߵ5ՔۥvyT#4$'8!PxbjS%'xhJ'GO[sjZͳu|_|{yq~o? 
\@EL:AO'OmTqvh< mYY7 ɸ;ƽ>x,ƭaXܭ5qiߎwټzbW[iyk`An h~VK.jv*YIAb;FK/}=ːț2J678www<[fIJJԝfbZ[f f ~es!lGQZeЖSY?u'"T+yB\X59b>hsQ&IEHJ{EX &V[7PLڑ>W5=[ EPi- 1K cF-B3 Eж7B1BUB3U+o|ۚI/J#{B쓂,Gy1 L|dUaZM_xS{3wM޽|p b~iυ_Xΐ+xg\_~i/n4(*pxQC(jg%x ݻs޼;NS>W@*Ք=M~Cm|M+ e\kP~?6o~+վ w..ٜa0応 5x(-@=NY\o_yx9sUVi =&=Ğ bU9 $HH))2qE6MU w٥޵xLă4q:/m[л\n6ʂ8ߩ`fefdfY6Jhc6zTC9 O"O4HIb1,yFi#omw/a3U#&ϗhUR-.v 7\wrtxP.!RuTB GX9bhtvs)\w[DYOz`y1 "H4Y KT'Td|"J(< kHOJk$Y" vrrm4d3C$ $F:fk'a[c9#@&E'*½)!'LQmtN&x&HW._}ǔ5 \F>DIS[C|:))$Uц^ǔɎpw!U6>P\ўytMG=gɹdD$r[Y9ı(aA%GGژ!0<BN<`c{}ݸ3~nGXԫ_H(,rW;sw͞MF::)揪?le"!muOqL PP*JL2rv$0SQ+rDnJNzywY})y.: MF0X4jA&|~赏^1hs ֻ쓊`e)-1V.(nHlQ^x2e4R@ŐTEHf2A`V8;n αyH-H98s.٨@y9wn);?]I_~~eg=XftXwktqmg!765,V"b{?{bG*>uu7o{lKWm8} mݽM{;_d ء畖tC˫ gEBm:~[:ΑNYW>Y|W|e]}l5}~ (4 t6d{~d<'1ߞuS54m`DŽ;(;5%tVu P5a;: 2EXŤ ْd0%E%?Kuk: l;Ah6@ S56.{>ֳ9*w R:4d7Yi۰ݶ7 _TbN" y.P<%)5IAR.Xbd]Q5Xmy(9DQO9sR1P[%ʑx9Q526gf\6ӌ=PXF,|T,͹Pd;n@.gb2[?]LN?Mg_8bcDAZH^DpSb LɛcpbȒr*f8#}zbxQiUaS;HΠ3TPض>eJ"90[tWL̡v3j/Q{dƽ J{B VBYPΊP.;c 2V>1I<cS<,13Boa 4lE'MXI$H8R-,)"bHlT{~P~싈1"GDh!Df“$2J*#k»P` YRIZZĺ8]SD ^ mLb(`T8#'AO-ie؆iR >k!FҐ:iɾqqōZ$#HVGB.T."B9s6IgQPdF\| \ 6ӎ9%~n/Cpg:ct症I{}Z\.jaw*AACd;xq;2D1 yG1t$LM!D07R($ -2(]L*}iKKT!K_]ѣ$[HSpE6ǀL_w, 8. t+q.P^%g[y[uW.o5RnbmҲۓ2' 5lf7 <*jKv.8ռ#0 \~jռ'}jܰy{Tvc5_TF Z_Fs9M|{Vww웅e7wӏӕI`:sqj0gYFi%(;LZ.@BP4{Dfݕ{gs]Gw4ͽn)͋BOb(Q;J)"me2*ci"J Mw5#Wz>&g3ݜ;\Znj+Jقu²Ag '!%Y|%9j5.Vv9)pjm.^IPKM<$keON:EQ!}H9)i@+)瞬 qYh L*&G1FezO&RCAQE=Ch"ք ͋$c=]\QEExS4$)q"HjoRv>)^L"+~p<7O&:'Fod}mT',&3;7w\W=/x>ω1Y g^I@uZ7N LzNkӫ8 N._.O^$tkϾi4Y%6C#4O8nwoѴ߈v5:L LS9|Og(n_D\ӵQ֋ËUکr8Vcqh^+jk*kn<Ӣyuv?W~/h׫0zO29~w4[veUm뇋8 M0B^'c?uQozmV OLx(֕t:\'ΛwUb]5H]gsL9RolY}~zjf_z+yp:w-ک:u7x:|ώX_޼~Ƿǿ~~OoJ߾o~W\P;֊,?$y&hx4s~?s5$vPjjoE="AIz2K9сw&+aXHMj#ug$ hr6@Ed` 2d,˜F@ (JJ42$8Ica3}\Sp[ܱ2 oUN㓧&23Ot \0IKN%ԀD]dY(AS w`d.ۗ0w/izۚLMF0x*')XoG`}A) X`IYPŵꥤ,zS|) uF1s4( D EgTLZYq 1-˿?_`>6PYdfJNe'1`VY%uʒqa&zB-*uvhbS*Zz̽6ʕgQ&pIthl7g3</\q9?&Ǹ`;!6={\y%>?N>_?)? 
'OXJ{g/kҬ֠NS<d'g⍗9%XQBG`9 *=x/A.x Ebo 2`!l6QAbzqD#_'pgnc}F"-n4 <`(G m>D "QUI+ ADzS16-xQg-!g^"`/"PJLS$[bd̈VfyDJcT4#a ό/Љ֒Rɬ滶.O+(90#yf$9zau2!`~tن4C"pLXnx byyS}J^!.ՃD$Z,+ͦ-}lxOT㏋yVh"-/SbE꽥Ug&6y=q.xO&rÍcqþ{N[2BG-/嗫󚇛8qosjAS)ْR@JC`F"̤9Q4"h< O ̯F<SIe{ANX`kd8~½ 'UZJi '_c6sAg=5tB}wQ%cJ1vH>v5h]rGr =2./#k7J"4@xcݧzD$0Hes:+9Ufgw;>펳* WIԮ-HY~L8㓇dW[u:+NQ7JձW9Pl%snUv|Nϐ] ؽ>A~|ѝ=O/ksܙ'2s6xQُ1R\ NI%<2(DF*mo'Vav;eeh%J D:CvNeGp6u}Ξ@Jc4.[Cq~^.7u9;z|P갼bo6ny`A\Яz7~#Nc+t2*XO|<6WvhקvFibOևt +:JJbgi1A!_,HA^' 7jgU 2g{+Yߏq#A 8kE\ wِczK^67T"SR4.F!I"Xv͌MzrVpZZ)O6c~zQ} -UVvɖ"e'^꬯O"mm}ᱛO1L+vaO3ȅ/taLf" p7OI}a->Q:M}aUb|78vs6}vY4Sy^gbrj^]!hl ϔzhGf~}tR8~T{ L6X{o~e{Vt1\}v%~89.}TV51.gؕw?-(?c]šUms2]}t'MkJku-tzy<]ufEE-+~{Xz2j{f (wfۤ@"ꀃ\fg"trwz`}{M3SWtvtСZs! Pȇί$j@淧{u"{V \RsNC4<%GUJ.nWֿ ^V /V !<±z(?z?w׫|֧XVW;jt\헟^Q?2aYOңp>CeGCԿU܇~ qx ,1߭rثx̎=qzA.6ح4D/?0+6f= 3[h?uM"ogڞ|7yUuW0OgK7Y{%i?gήSN)xw{OGG_O7W:[bV߁Zm=pS_r:;lH՘#MO5wcQ?/c~s"׿{[uh@'v|x0ͯb3\K~lяUZpH样H9jV(x[cG>^9v";-*GK6ѕ%2\ڧZq?~GW]]uVβ>x9_iDž vIXYI:m/(z\ r6N=z{ȚJ\Y{whgPFj![ 3p_l3QG>=_?[+<åᱵs➭f!FMքqMW}>xib** ʛs{? 
,#u<iby-b˫bx#?B}9jD$mS`K?OVg+m֑~a?\-wLYg{ǵ} W._ .GkZY-o7;B2frItT5ZQ/l ׃>ߗI1T iPLbQd#.TL.d'hnNC?N`k1RC;nds)jS sHLjhbSIkɸEke<)ѺZcPO^Z FRyGQkZѱ(I%cG"B o";oTcԚH.ɐuMq1M* k%%q9VZIV_KC.XA1c1V#1dJٴVkΜU8dF2s`skeuqHh (SnA"6l2١L4:(T].zr@biͻCs [ oJ,Dᖨ+Ј}GKVB7Y,O`ɐXl+_ Ĺwޱ6ܝ7G!^UY;USckJ̱6's F[5 ^5kAr-;yۢh?%NuhdJʦ VCKq 6;KFT[റ8TQ'm5M3LUqQM٪\ -6r%dP13,HXF*H`jA 1 oTttC`ڨ/~ E2U1^SN ]BoQxJ4*笂Bzh ?jvb8(rȩCMPæ"ID]єF OYVhl NJA!O9:C2+ЊS~YTuAd"F_\lg3FT} 3!!_N#֠D$,)FtY26 D۝B6zp7_&$M2: A>5(&**͉c̡:C9i #N {01s7`y/7}·g^\4%QNw z m|k0xՂK<KJ JFOі#r2@]Xo,z+ &--D^X˰) >@H&=0:ɨ2GZ0Ӆq0F@0/> K&Bz<-x4 , M`:*Yȩ ՏoT?Ԁ\E݉l\Hp%g/o꫐;>d|Xl]p=Wo1K&+tmm,"{ ԥv1MbRoF\ ^B aZ& _L)ڽ,=iq!T׎i>,k^AH!%vrm>ȤQ"`RRCj <~ t2Б%YsΤ$1S"kU(M6@qSj1pO*üO`AWC:id5MThPY]xOr)ܴ[oEU^ p(XH IX } ><%7"D:Y{s!@y?ysBw&Q`}?I7H7ib(Z 6#h(6R`7M vT9 ,FW-a{mZ0Qe ƴLE_r:>@zRZ4hpP05aJ eP؛\`MZr9VASF; 1lrH3 C)7 n+p}vnV(_V XW \!>HjGO:]F SCaLE#$q,Q݌Yw=guS[ 駀PF%t zJМhcr(rq@ <֤] TJKP~Jtd1r!% 3Y]SNG֒FvʷҌOƛP[Q k"` X); Zf*̀~"2~_H O4%0LJKF=:}IuJLU0]CC.2cRޕ6r$ٿB %}ݶw``<6㆑1E$7GI"%*X0*fEfFȊf9x&H<`wۡ B,k`L%wJ+4;Ҁ*wj0H*ߏd'yxrWZ*S$ ځp]%*r+o߶o35 @U%ߠr! ЋXI5\,Os9a*AUxTXU[@ p@ҥ3㠐 @VX# L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@zL [>1υ0l{ RZLbL d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&Ћe1JLi qM+I6qIpOﻏ)\'&WLFWiX#fn`l2JULY-xU;tm2&R̰CU D [Ufrt2}Ozp9wJd"@,jG!GGM(,O5 "{o w~Ys%rP8٬atߴA8{pI 0XlIk< V<߫jR%nPT(&og9yq0;Ꮫ;M>܅?FsW6$ͤWYUldBj36y^TkvyО~#cN[Rj(}>-..{`vP;ZPK?Q]œ,h_cnOge=z=+cVNF2G 87iG-ﺣBhsǩ r.ݴD ][[?ҎW;]LjY>e^c̬Wq#4yQBv8# x=\|Ɏ%!WqN?*ǍZ4{ܫi׭0KU.}XhMnLoyEZIab)=]wi0W.DG^ҷfeDΔ}c.voڞXj.KuaXJ{+إFPrTvQwxWH7Ŷ}֥/Dnmsu1*ڱqgGi^w䭗TXwcJ;uú{Xwa=uú{Xwa=uú{Xwa=uú{Xwa=uú{Xwa=uú{Xw'&L ĉ7L Xɥ=x&'J2dD&`5B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d}L۩3kfGo%մNOEB R5yPΔz Db>P$7%9$D9K/tcІ \˾H89t)9W/d!=~(dZ̻k˫לAN.Qlr2czZt]iSi*毷\jz7~#6g-/Лo{YL <|ut2on8B`ߑ3f9[LZӣ6?oQ)_zkj>ոLi8ZaAϫq9:5;,~LƠp7NS&tT}QՌYe"!muWgGV3uuLn]2t( oq {U٧q(p~^^_].-GXzɜDSʮk%VjN1,YdnI0! 
둫X7bWU,ڃwP t3xTU"W/*Y( \ھUCZJW}z sq \wUT%•UsXwpUq \U0ypeK"pq1>p9Q WYURWXwU7pUU"WEJ^ \ Ax+<$&|fRz*ȩRi",f1o`?F[-&t?IBZeqofRq}%Uu1W`#c(B>?E`I{Mq7Q5H&^7!M)#AU?oJΊx<\)EzpSVGxjާ%lq5 VC"AP}Vf8igf`MgE-b-0n~{7=O Nޏu_~?c1ꚟŸ!~&G* J8!F*N[qBqV]," C~bm>|߾͔z'l$҃u+,{Kv%UʶzfEr2 ePXf)xqVL8.FG36ss!4L.WЀbkF>=KnC^/!oxH>^ԆCQ T~1yVRwKmnh726UEeEQm 0/J8_>ףE;­k _ 㺀MuKkwtd-JDwl&_&o &y<Cn4R7 wrWs ;t{=]5skUWWb\ײ^40BP?]Jx9w/ʲaĖkf!o}*6J7+oR[E=!V´.+§Iz1orlNr-hq69O- LF}O(4i)׿ \|{Ǔ% •kEgg5pZfC\_ݸyv}?8/:.,\~2H?Ƕ|Fj7Y SNڜ5,])\X06*?ĢWq WO?/ -ht]fZT/(ףOZ?R恥RYIT.Յ `0U& ;r[Al2B*92qe1qh_L^t%&2g.S>%"!(ڳl!\Q>etNgKI 1PRrD퉲 ՙr]Ж;+Usgu'dP(0xy6UEBx862NsMHP*`^Rʝ^`3evjI U*l 巘 4$H4bzA"bBdžڙ8j:+aT5(:M+?ig|LMEOM z%..,-IOuf()zcos|kkkT"lڛpQ| H$"u6Z@MLy L9f2,Je'IȨ1fI9L3 '1[ϢK`Y8 |,*Bؙ8#cwJgXؙf쉅tIp%Qx'Ej.ֻUNnvi/y7~sCFɇp6;O%Cɮ`D0Q8dl)"KxV)bsóa(pg6A DF!#f&lV2 t؝s?b8BCAδ@P{6zےL9΀{e˹$͔$6˩ Bi4Y nV&eN0K7 33A8$@P:3q:O0ǾH;FDxg&!E$3xtX^~TFK.98ByKRc !'Ng{8Wi즥Q2` N0"PE2$%KjGSf^UW[YU4e)XbZ3"|,se4&jˎrsUl<93F_ D41h}tBR2E=B2jSF`L3vDgl\2+{gn Z:Ԃ+A:@@{K /{*r麛>*▱ hy ,3tny$<ٛA/#rz9sȾKUp;JL81b"iZϊ;c0kaTW'-|g?1 `ϓ')<_x~,e BA= Q{{sݔ/+)s#CԘy+pm6&xEQ6qY˽Vr%rHctmKpE nNp;7p:-}&}^5gD$mwסЦS*0ރ'rhCbu>i{XNmVv;vjBZVwjyv!dsEl8:NwNEFVRߓ\R,ol_y&'!.ŎG,z'j bm@J&ڤ-Mۖ* +1%2{Pu>@X& Fj>E&v(7jl\ʻʝO`Vo2,h(J%%H&,LJS;'a/!re ` 0Mt$FID{`S[/sZ"wHK̈Fӆ`"{&/y JDcDpCY,-,z 0li14F㌡gm9sZKnH] _jP)JŗwWYF/5 :,\iCu 1G#,5 VԻȸF Ixa<ʷqXEPBػh<Ʊ3.p[˱^/ gKΓqݴ/1s/JaK)>2("SDPHQ8!7v0}qXE͙]4z@QӒ]>N`> &x:e]셳qNX Ȓ^ı0#$$^򖫈nxSl|FY S6CcOU*W؞CW4&M>S†n*H_O;ބ47$,R?z 4DRUN#R0nHI_>Arb>wnYbޜys46ᗾo$|XnL >L;*^`oԹDVnqe6Nח%s;5GWha& ?<Ib_ՐJ'/st\vWoftɓ?\'+:">uSnSc(Lts_\_Cr(q4;&Z-= 埫BC|~GOPtoŒ\홄R?ُx/n8c36`ع+Agp$'UDFӴ3hy)hMО$[v͗5a.6;$DD}VuQ\%N]FZbFR[Ec 9 xʊk1Aؒ Pi@e=[Ė*ё̐2Ӑݨ+`<:-C y#GCVZ&F H1F¬H52QqFm^#L/@/Q8Tc{cP>M:wCy1KR :چ (TXKE4T_]ż%$cC@KĖ⥧/)~}3Ɣ}mH:ȟMe7;1%ioa'r5bOnNT܍J.ga 碠L .E笸șFQ">$:ͷe,)|PD (m0'gвbl} >Y|[Y2,'\Po(l^gYi(٤`PdN^T4$Voф?~&ۡI3" .'>@9`7f_)a0+^6y_NęP}X[M*ˈ4ym`?I J6~WlO ѠX m K&XL<.h'ra83u4D{S(sVA-m=ȽuH Uh8 .9|DЌod97E*H'ɩG,fO^ϮGgw2^-c/I yӊׇuDS9$"OP5d/|Ԅ rfP82Րon9wUpy(}[G0g3^WQڅQ:K 
N0I^88k,&uDʉsIb"S6-&b;UU^V ٥fّ#ѣGxHw  <$X3rVb{UPQAm> h-͡]Knln2Qh0P&ܪCZ]Hme±&6V:wD\Iփ).tlhhXWoT\s ^-FU5aNz$ƬC6,fC*ZD@ %X_;wMBA[EtzRcO-` t[ ҀRS̨H7@2!՝l\'  QA\R}2ɮ3o%Ce:|=Ʋd2*zbJe5͐jPoB+"X2nPƊ|̷n3PP,Lh=4v#]7cEfJWukʃ AwŜN0tAC\)L1R"%8TR 3k'T b7XC9(f̤cB8*VZ hM e*3[!(@Hq;A(xsО5ƒ#%d~GBY'@PSQzr{vRQvTuA"F_R] 1r 9 :W5$絤"pPBiuH YV+"1۽@VQ=r+Z-j|p.#hn!:38oG \ M/fĥ"ҊY7ICQŘ@QE!i$% !pB6}fC|MgⅡLv5в./h7kW f=fU&>Hh"X 3 o b9PT8xi7olJ?l:@GVZLAG=$]I4T*#d`&SQdn֣2:#.3(Z R|$ LjyU,C Y0ncP&>yt_V 5ȤuՅ:@vdm^z,:P% 9ՠhKP-"V1hFy¶e@TDn*BLv|WK7di"YrB,GKе 2"(QwPҧ6H!/Z@mD%|uj З9t IճDTePEx(%%l-ѧ@hg]+y r.!z-bB bjwXC-D{Phň޲A+}Ƣ Aa8J"Z!k,B$ fj2RA ٕ <=~;C VA;XVUp*XT*TeI|$,RU9QbզX5{-6?XI+Ytg#MGhTf%ݢZM.U{9 i AIX@iT_l:hIYTtaV-6`TڹO^y6gs.鰺LچsU&Җ`֣;V$LC'Kac+i0=Ilf1hmT]kM!JKBq$RVO ]j31i|ݣa#[TfĞT2j7LJȀ-ɡmM1WdsC<܈ܢYf1uA;(XfhTYڂDzC<֢;l֣bb z+Bq&SK:)Hvr;X._* Q 2]Pʢ#6:XLzށ U l?Y[Tڠ@nǃNZS0h¦f@ 2).$VO34%0U2'ʃ֞ +JOQ!,iJ ]\F* t h zuTS.\Y-H1ˡ옅jIzxIPdpN %>tR@Ւ0[ Y\ yۨO]gF358F'~`AA(Ջ6B^C_>}m_̧oKúφ`1 kӳ4G'5^f,6}a+mts͇WӾG}Z nBnnx7+D'Ku6n6XD_tz0P?g7C7Hh`C9]M¨[xLC7S0&`ߴx b6Lex8 q?l4Voeq`C %ltG 2cSfp2AXW8ޫv֡-_?nn"!~ܵ\7v;`X'_mYzVvubo՛2s'.br5_?Z1\Sw ՠD \B N/FT](FS?z3k6B l'Xظ GCW7y+R)+oL+ٌ ]Z{Ptut\vLtU8"j4 >z"7"]Eo#+',xvshoLW+w.ħ =}[?'UWu)h|>vBtuYzfDt%![b4tEp ]ZePLW'HW*HƤ>}^+X*{tE(Y]$]@]khP*tute̓[JprFUo5n?dojc:&E7TDע8JOer4Y8Jh:J(mzY5USDj2x,?>l:٫6"GDx xjB|oOAG9"j4 >z"V0] ]H-]pn4tEp ]peLW'HW!:5  ]\BW֛c+B UQY1""Anc+ %_bЕġϥ>>~؏~p{OuEt?֮t%>w蕰Zvb4tEp ]S+-|taDtEb<GOWD+#E~Dtuv,tEh?v2R aN6.ٓ n7Z5?$g@k8oÜ xbОϸN=1hQ3GQы+&@ݹVύAG%wzej[bnr qJ-[w7^;۫_o&Iߜ2-˫Ӽ|/_B.Ξ+uhdpa0-!k'E/wG7Ho @ |S$7'v^Ww_N,bůꀻ޼;Ã0~νZoo8gmvӊߴo +PT1ަ3`>Qָ#:ͧe 췙tAKKvSΗ./| "gX׿ߌf{XonM?X!M^_cE֊Х#>KE$.R@T(A[p=W܅ʾi@a_>,IE^>{@$%Zl2/rF4Rx38)N#d`,Nr6.,!S")[9wy!q(JJt26`YfwMOuկObW9 qc &_|%0ߒ9Y_bTb}4a^β~laij:̳+哎YinbSY>hZz9bOQrϕ6TGh s8R`Ek$K<l>i=IXEPBػhF^8Cמɓq8}$"hMp> yS<0{)2("A!x% o2NHĩc-uZ}G!(ڲڻ iAdGx !6pА1b;9X8e8iic N+tǎyƽ/O,r86}>ATgg(b[3~ O u= 1&p92"Xx',&9ɂs|̔C>˘^ᓟ9>sTG'U0|Rޭ Hn=y}8\f`aOrG7e.&kti(GmR^Sp _Yq>W qM\ZJ#mRr_-\ 
\36$V{ji޾1U&PMZs!Z8K=V,|JѴ}Dm[eahoVQk[].)rO?fus@cM^+yRc48׺2E6W@Њlӵ!3όϋU6h$DI_-ئLD^ 3鷂H!lU'3o#7Ց5c>1#Ҋ6}y2vy[|W2~43dwb<"' :yI1o8̑p2lb.}s;Iu|1΋Ӕ&6BjtݣCqqǘ} LgëZl&ɍs?X c0ۼI"#scƨ',7tpY3]8zC}8~UNnųvXK9+Ę Lb%A(9j-tR/,vb::@ O x܇߾[`N0'xzsp;pt /`/1x4jely?c$ AD!zā.^E/~}ޝƆ^O{t978PTC\xj:0ZNJiV +;WpL֒7 T?/L%(%;ʙ.=a1]ƙ;H4A'(i*>]0ul.};Ph>}8hjB@4(B[_/[#v2 PX#!J88SS!cFY; ZDl hjZc9r?6Pmy6ͻmz@/ /vX^61ݠuTALW 26 -85T=z&Xp3Pr-‚ұ3N=B:=vwS-̃P +Ltli-e)y" T@hF@1c)D:Rb)pZFܣZ^^jF7s2;#gϒKiz4`s)W|\ 2I ]*V%Hin$vStcײY{Jg9^In<3Iْ āѤZymH0pT*8%0b|܍@Z +*#R"%1dNF0SQivaG*L}qw? n1fOfeϽYa5Lx c\;!bGBjxӆ3!]cCCjz*մF2F",Zh%t {e5IU:4""Fz|>(ݝtύji#$ZFAK("N`g C 0#:nl~XM8hB6Kk "$+/?8I|KK0&:!YI}kOOMgJYjhZJKy߭<6(b^0FVRe+0SC^\HA)EbB$t^5?@04OAoacjatH\)\aA,W lh__\X1\gSS z`e_6㕒l?ia+*z$7 `c|Ymb*$WP&?dTΪɽn7eë|Pz&F+]<bP-ի]s\$A>x=0%Cm&>glO#i46I)`L3'ujpḒ{!fm{V!Y)}kȯ%O)>٤rߒciPv *'Q 6W͙N߽w7篾{:߼XypNZIЏ&G CODWS|{LIO]/>.erҠn-7_¥h&n<(Z: Č}曯dUrJWq+ѰRҗ"Dx,y'/$fMN"=/r;%l%fޤ['=bxL}c֞ߣ}+XžJ>}dթarl.bȽbR9c6([8S+u9KJDu/|Pz}6RA۶Zw|}+geqA* [=YV_W~e;ɪϚFḙyQMk6-;l\T &^6>YTF3{3YfyIܘ]O%jekp~īWJԓ+jmvQe94 rS\(@JQ_Y ,'+˵hFEIØ&h Sˌ\j 9gV$}=2Ҵ/smHu? 
tƒmb:2}aa18[zhћ|}%A,~s5}<]kwn'Ts7)>7%ł/+IĐN bc.2oh+W-4^kږJ~\.r{3en&.4h6!YD.5ZRIKua =`I:>YjBgs%h.|dlfϠ]0mP1g 겉.cUYT YQPEkW wl^G;d[-q-x㮇3elx&ՁL*ùZUGc)\& +W8)RO4Hg孴 {dmBBGQh@(:| 㯵!bxQ4딖wWGŬ?[ 4죙&dJ_ = o?jZOm#&1y0Օ'Ciԏvr0X\ϮjvNtyYgVxf5}~3W-b3 <á~{FGò]֪짏 <_^qo]?M/N6ka%WF>%"V[7hسoe bVhR a.s* U%L}ۤ#ue=.4+L@BrV%r%zI,YL\u<.{.n'/[Cr>7q Mjj70VTa\*J-AqN3.uhƩf[3w:pP\Dư?+NR"!!aoMy\QB4;FCcr k4iٰe/@v*4QV~<RgH|IS192!^Pq!iHsc+,3jkJJ%fUH.zr* "JX4hcqFĦ183csJc\ؘd< k ]υ{[9kivpIg'Trb2}|`YJe!OawG?&0j+ص"R6;t 䳰4WNԐ$_);SV  Nht,LHn.k_}롯 LMg㫛zI8,Rܻȥu:Z~!$deLg"a=%xSj:7Գ8e2[< X8ZF $V^nW)o(V͆ļM؏tZu:/o)l>ˬwV~KwVUy~zf&,9F:q4N4wՂ@?8|x2;{/~낏i4`~jJ ] Y6SԋfU!G|%zVoYoY;c'g3䇹߆_ԅ C5;4t~m_qEe<|,?,U=;|!0g} Ͼ^?^$\\`1Q"fz=Nf{=tG7sv]$84K /scOO/.WVޛ W3޹Hi*8mڥ3+m̗Uؼжee.C-[4 u9ۼ]Dʠk E83VCZ,s\&)5h~b2h+.C"CAGI8JZxD$x=ƚE~ܙ]^Ķo}.& ?$6*iƹh=B/ɍ5A[7B} kJ{4Kϩ>S!&Ř8Bth*G%R -բ_PbB][H,*yelOف9`S[+=g5DN(P}s juۥcu,)DsA՘Ӧ Ѧe)'ˌ+c\&Zo|1L SzksZ~[׎xbnܿdmFAŁv/҆Ndt-Y<*U>o"d UP*UAW([m~:,mv\j{g F9T1Ye:@Uep RF.+(2Ypk%C*Oo^G 0mNz:Cx/?OipL:^:ޖOǶi7|oԳ5S*VplgU+*h{ TN*GBEN]!`Ǡ3tEp ]Zh=]Jzz5t%zq'aOkl]= a0t4¶Jtܩi-;DW"tԦ#+!%:W+ ]!\DWjvBv*!BLgJڮB Lt0t$ BWV~PtuDte+zzqI.Y=탉0\~HTet=Lf5Ga.p4^ɛe9~p<$'77LD1e.ǻzWGPN ` 5]T)]72_?u lcM\ǻbM Z[oMJ5AY%BCKWufjv"FtutE9Zu0XWˮ #߸?F) +Mբ`3"%Fti8ZNWr-ӕr)S;OաsW|Zs-JtܩVہF3tEpwj=]J#z:B);DWX+tEhi;]ʶuEJja;EW ""j۝:!BRն+tEh%p1ҕnġN}Ҡ B5_]k*~%V]GJV~W;3(h{c >5h/A4j%Q$[YnKn?uZXgmuszpdݯË\(GZq۫O6՝{hٲ?u ͗wwgB#J^N.t~l ;f Fonɽ8+5si[k58^BDrKqyF__RtϙҐЄmӚu2_A;^/2E}O$D%kO@ aV6Ooę:eKoҜp)&ߟ͸cMN^-VQtmj'om/)w⛏ui)D\*nJ){c5gT*6/Pb)"3~ֵ%JmOW`˩SfC? cNxCU,zJղdx]AOWϝz8]`#:CW.NWr-񷧫+8Quڕ3tEp]mihi;]Jz:BN ;DW•w-c+ٻ6#W>$_a7, Fv7kd8lm~3$59$E$噞ꞪOFWiJo:VJqNN =Q5YZy!,2GUkMcm'.ij4ulrN0v>9uڼ"aT0 aZN \ \eq̄ei=uB)EW B= Uמ:#spRZ;,1ڜ\&3qU'Ϯpi8y<}_ w4W1 f0/ZJ7xeq7p6 _ 1`>)DL(Ʀ"&KEpZXv%XǴ7}o>Ugc[}a؜JfW'x}dd|uA<ͧ+gy=*ɇӤ3k&VkVˊ.#n+.7{Vښ-X\^'^45c庒bu3kuLn?&Mgv*o-S s"31bDzLPntul2pBut#<>~7q EpNG q>Ew5~T][%F'Z.D\uȵ j!ezi s׋?QRY rt\p9,Whժ|[\wUK)W6w4? ^{@LďT_2z)Oz AuN3rCmpoSdmb/`s}|?W'UV{C!F%38Cm Q p1SD D3SH,+JT+~aq1-䵡ApYxH#6/7! 
~MtvnõdBD@3rJ9*|]·ܯFq_ű<m,htz={?% pJYHH$E1DL4ٚr>Jhݴn3 TlPrȄr:>m5ԔGCys., XVL2O]WbVRTBe7(w~G\ ]D*c jAn=/k\mCB#iB"ΦI/P+`~[duU=iBn?g^_y[.lEs^XuÖZty`!wVNYb]uA흌O٫Z}ڟKucwN=˴WU͠ެ*Ē,zx |9t@͍a>ouwwH͇<~0e{*^f4g{!hu6yj*a{X,[{չ^Ե󛑏㵰E XCq3mZed-]Pe JׯKmg;YcB݋ ߅v;{W4\C6cE}4Bń?M:wV(-Dର `,A$dⴂ(蕉)&9#@!E{5bM+ #b^ ģS5@xؙ8r7q>T38iLjH{Dq&! "e<:k.dW@*aC^+y2:¥* N(-7yMJS`) ITQQͣ=i&ЇTݵcDL*vj):;ӒcqwwN׺(+Qr@.|L#咱B"s <$9q)pT38<@X˻bZk\V?Z.|!2?=LbjN/o7Jq".p!ѥF O^ɩ-<Ǟh!Qt* p>1/:Rea\IF)f9i!Sk#$((S2QHM@Hv̬;sͧoOh#?vhǐȗE{J<~mV[9xv3{9L m E`!u⌨(  49Ԉ+oѐOYCs&ѐ8SڈQ)b7@Pc#LFf]2/\u48e,gecXGgmz_ZZMr\[zߵKU B4r}zQgt/\HL! C 9ҵ{O׉KOٯCuT_w,Af1\Mbs .5rU>RocЫT'OJ(D h̓uR=I6 # 0}:.$!zNGP#!F;fs%;M#ؓᩨx͒ХzrAnUc+vdȷU4F<9UZ%S}Jh;D X>T @ QsiUbƯ&CgS8Ȼ *D&F\Ж`Iz³3q |IAzK9Khޢ M4D|?5yBFo!\tHJ/d_Pv*e+ET>8OM&: 9a^"9ÌNɲ Idobb@%vU)FQNi6$UrzGN!yc h` <\;Uu+ր`ʑ>+(.H5;3#$0gό|왑V93aRjvZgF)3#Jf<XTb jTT S^ZԂ.X@=D$R%xn$ =*O&A04eZiSkQ:gwNhR6igË5sv@K﫳[6'Jg]> yj ~2Q[7ņpQpA(aV="AI{6CyJHTS'%gnbQ&ei,0*2Á|,v 1qcݧD~b*58;c޻Ȕϸ^9Sv=>~ >xs82QՔyłκ"uQ*&##\F*gA΁O0ƫA9r>#QU݋ zfi|QiZ8jNPCLhMyBJh5$# ./ո^"lb`1TGBQ/]Fx$Dzb5xl_o m`;3ӡ')m,v [Rg&A &}D%PmHN'!-QNOu<* ==|mc}JjD(QH}rM(Ay4`;N3q޿on:9D{" #;: er-*FKe04,2`AFJɃ wf'2Mjb.C&Cz)ڳb#g^ QS!.rBޕ$I^{K`,<{l>ƅ*<E.|L$7`I2ʔ*<t亂5:>^.__>:}=cb7qb AfY egSa2;'/|t/s_8gGnBڗ+>>:0oQr8&Qq]C5H6q4b{L5ZkQN͇VUk,&T&לuR3 W=mqΜHꦽv^kv\$[iյFsk۝.\,ӫݬz\אfӺŻq\Thjk]Q\kZ hN^uV8Y!3Ό`'ӏN^]r|H&Xa+e]xY=qy?ƌQVft:=e_$xZ:?z; c?+Pw$fqG۹NMYsWU^g<2M;;[UFb{[z{\~g{#2`ZI--TUٜ&sZLGdUuMFv\Ԩ:,x04 ,&0y,b<1R 4rG+Ę LSd –4PkT->1!5tր <ϿuAA"K=r( ZiEσ A BiQqFm^#B/qg('])ro45! Za鏗`Ѓ-uA;C(PÑ% ب!cFY; ZDl hjZa1rq/w?LU,}ss!t ~_J[i]V`ʌ?JL>%|;gRZpkCfQnr,8˙7\KGq(ImF۪JtEu!ri-9#m([d4x::PAX6e3iV{*c^,r :*5xJf|FMAX.X8 SJNGK uZuOG*#R"e*2ͤ"9`LEYJb|l*L=t?6A Pђ7: ak5DqB\Y +m>0h4-? fwج1aBG+aѵ& AIҵi(7ҫ vk77c. Bh_*GtK=KK+ H,R\*pe5|j\ YΔ?{Z!I뫠 INI/ai 8 U 1 HXa?>ԛ~xyMa\"?#h$Oi}y (w; 4**cX1@ SDL J1uf4 {}8< ]WϦ*aYi0@oacjatHp;R29#_ÄtiN޸'d6;}f^*p L!HY}5jDsT([Pvz \$Yh6a20 ZI佳77nbD{b.!HZ"|C_(2~{=Ѽeg|@[{%^n Bf{?tW|q/r,5* ?:q:s`ǿ|ݛߥ?>ׇO_t:}7޿S/0`8%AoL 7!F{]+Tߴk5֤QWN!{s!30h;nvғɷ=3rɛzVzAS |_`岤mgUs! 
t)/UbIdI(a+1&4:-#&ZR%#Ht¬Zj0n"//v"Ķ#̵#AS"aϨF 4m YaH4rSģa&g=`vgǽvfʵ^;+N^;Uγ`<&wU^ã 9Pmh cqbrţ)JFJQij'N,RSD#M(*.gpC59ym_){_S kwm"L =00W0=MyZZ dI=,R=+b_h[zA]ތ>M"LE>%\כ, 3-e Æ'>݌Zen2ߌJ=rư/ @Y/gXϊCj~up\b{?UP|PT><2ݶCWwٴ4uVg/ݵ 8r~UY4ywrM;.i6(wƓaoT c {뇬V9)Cb/wWuY֣ 1vF`I?yIjazLV?ȹ]86q [+(sɭ`)$ŤMds˅`G|<.6YZ@\0PTJnLhx4BGbsKe eh6 kTz4a#A2sVk/;^s%X@LmY#[ˊ@ڥf )_U?)/Nݴ<O f.cc`jlW$Cplg2dlM;sփ))Hw00οyMqHuuD9h:yjw1jFhFwxi5nZ<boeٷNiSf Z0qZN+h M]Gm-fuq?y]$ߤ oC3gW硗tb^Cfa )]o떾\?zU>N=v{X*A7ڬtÆ5P-`ݣm-K öմIbE5--pR^5NYkMxj/9v3|%gS4aJ뭤_ ]݋ݟgeCu#⯺{JL*z QƝÆ*;ϝV>&\{QW6LASyr+ͱ I#UZDEvs(jtT(`203(s#^*3NB0ijA5F,B0#BXYL^jʈ)?1h#2&"09k#q`vypr 7] ~ϵm3 |n&{j0]^5}o\[o'z!|T.Rm7Ky 3Ih5f]9;8mߔZäd)٥[քyv?L"Xj7`Dz&D%m& F#q~ڔJJ/*Q鮋+!ފ(,*' lDqx#!Պ( ͖~2tŲg#mf;KnEC".j{PM%{q0SK(/y!W2:?{wc`r(:M}VSaE,&H)]tFRKTnC?>}VF+%2s Ҳl2SS<`s-ؖI7YehWyU^c&v@3[8{Årq-ߋTzm^eR<,  >=?d|~S6W=,WV11|G`zH!\f~9=B͖h_l6Vύd^&(푸J){1 juql/^0bG*,Oj/ /{JQ>@F#J *QzJS|Br)e3%)vU &*$yZvRPW>'Ŝ)s#CԘy+ Mt-l2.obl'tsAoja8&HuF“flIRh)݃NJz=koGHn^ɞA.1jDjIJr~ÇDIhr$6 Hf5=z}qJCBK-u YMJ >'U]]ܠ?'J 5.:Jnuf6wM&p50W uJ)/X?7jX-:LS+GEtGE>;[vU]-A3t!$[|ȵ;XJBsVj KDV2j hSlBA0sgx;#xm e)8灦Ӌ 7y];i]y\!'~N1bn7?ڬ%3MI(d$D2K-3ٚ,-T ӎB@)y[#4tJ+`cRK\$㷦U!}Tʐ$Mp Q.d1tΆwUƒҘ0"2[)I bbm(j%r)ćjɋPY* u^jL6q>& se҅/| "9/+%HX ky ʭEƲARŒ6e^eB<:$L%hlhlDixI}lZk+.ЕHVi[2gС0ɧ)Ll6i;{7ɿ8\:EϾy79Dڙ))2UzYe1C֙KF Q!0*w1 TZjǩXQ=%\f,ߗLmYun P>Xeo|A`A;4`ڇjzKXokD^_)҃^P1'!!P (Aa0Ѻ(mLJ_^=Y2G/Y Ȓ)FSx}(%DĨ ̠!,}WJw)?s/iЏA_齾h> g5n.ƣ(T p?©mopaիoG j69zhhb2QJބu g%8wX~ae&7ۖYIk/VF852AVsoe~iU7(]R sz;L2dz7jT_~UC`P@l;j]陆rb'^z}Ě kggm i_X,-;kS>* -ű ۠YqGt/$ǻUېw}o=8ZoPoZU? &\:|Ӈ(em8 _{I=iα5맮Ln>R-{04Ԏh]~Mpfyl}&nNOqǟ$iyL<)x\~|84{vna.=mvCO=^Ga*x u A8L1h!EMlzjP*˃tIo#O6:R%ԑȔ$. Uc,!тTbvp[o`{LyF{^Gwz,&S)PLaKRmFlFЇ\#er>(,S@ 9Wو)$D2{3Ojg:;Òm/_<ŵǵ4HaHF!B66#6\Bf rR!FY'D*c}ᇝaǾ-+< 08E?Nj~mΣ3vϛ*l4f&c!Ԯa1MdҘdAoC #B*߁ -r{@ȭ`>2X4Ail/L+%cMV`SvUYeT9IKMT`J[["d$ xcͺ3p6 ELhE=*ؾ\1y fZV~1/G)dH)c$ɛbvgPgT%2D6g) -,GAP~ Xpm%H. 
M예;a"^h]VK{{bqC,U!˃\H\6gM_u^V6KknflOd*!#uGb :o^HFXQ4%%jʡRg=Xq[UBJT%,..Lq"FQ"".bc* ѭNwYpC) ײ4#^.Edr3%!|6X0hUّ"KC)|V΃7X::PV!$]дB.>hw[ B&l2 U},=N- hsO{VOM2Yڄ H֓MI#Պ$膴oݐJtC $#W1xBʠ>(VGP[B&(E$1A ~']l.YrκZ1,Hc6O^zøP~~ci;:z/ybN] zҤ7vB#bN220>Dƽ_u5>7|@lhqk{k}$<)CytGoUJt1ӫfvo15cVIͧ@ 3h\~pnn 駓;:x{R bw6d= C:~OWsllxg_rϏ'g7/5:޵4Qh/9GӖ6^xW3"F'תT9ڽ-=lto:MΫ7w /o*gwqJ iުp7$j.&m#pSVRWV-#W/en"k&LaˣjYj5gɺz֯:Uz UOs FGN}1fUz|,Do8=Lj'{:Y0Kw^|W߯9uwO'/W߰R픲3?;K6_ۮK3~nt]>l7Yw+ֽK>,dm?-dkK@Jp7mĢufʂ cѐJ)V) wK+QRI!yvY扣~)?[ץ.u`ηzsޛ}{=|fr=^]L%PRĘ* (rp&1z/K3xgۗ@t<{6T`E!OU3%-5m7ށ1OU}GpsP&Be&j'3gJ{9_)4n!ص{x<'1S"ijwUETX%pUeDdI1#U}Nv/W[XTvvZ Wk8abS]GGmߩ>8B[X%&N/& ʃMՆ5.6[OV%w+66յ3)|} nf 촻[ʸb{t/' ^UY ` 3˰!FDIa$DIo㋑=#i.WӸ'NA 0Mt$FID{ ^*椵 ?CZbF4B6\y| iCm3P) s8R@8zHXLO<h2=RO_ZuPU^}Z7xϿ\`',ZdeӺB\׼GxZNY]ʜewlv$DW{.ok,ϵU1O j<ي2t(ƌxiѹ3|t uƕN3h4o@[d$f!Dt 'Ŧ7v-_Z;[)DR\'ŷm \x-OKkՍGʀ-HJ|R_}xsY˒ʟ@4am8pΨ+bPJ{aF^%,HNe%%- Ejrz[L#&i|j'3 v" =y#GBjRg :H*HB3"mOz^.^ʇ+jp+A\b܋!ܽ=>M4FHc :8_¥)>F,sL΀r eDX>\8ҕl.};R^X `ְKPA (աX#!J88SQWGCJ9ō8`1waʽjB/hlXm 6/E?ڳVbf'?'RAM&,H%B, 0eƟe!WO$b)l\^~$5_QXSp  F]JQzZ'4˔[SD#M5=Up=Oz/砂Wx;ef m+3F> Lo;ar*"v9׳؄}NK̵Q*)6([8SPٸ 'iIWʀ{ mhQ2w2`6f_GTxyU}D+z4eߚ!C/9P3_YפGWGSJmΌ#^;D?Pf^ڹγi>uS oױZ5jr0*3.b71 t?ŖbC6%g7a$?PX{/liumЗcp01'oPkvk7pa~ nZu04/ג åvE?|C)֍KZ~8-)pJR>4 RyeX3min09 ."y0HuJW͕*O흻 s) hB[GC$I!K1M9Z+GkLR+Q(Pʍ:LdԑaYqGkZrH!Mgh  Ct.`'kL\qSMM0V^ޛp7MaJ9@!'LIV(c7FE/LޢRCQSB{ŀwyGH+;H rH# be}0z)#"b1h#2&"ݰE18[#Nqx2ν@ؗmLЪ󹹟ݔl*y1^=4f0"DPOB M\ro8I4f,%BQ v@QAsgт`) NP*:d^yy?+gu<f.Lxq7o0A5hSrc./jnH˫k#.|T*ŵΉiIGGTIڄ[RDv_Nwq4Q=tj.m\U: D~j۬Xg ʻDWcpg*Ŵ+tvJ( q*q2$;d\BWV\ΑQҸST {65CoG-|gM::Ԉ h+}c>>i h gF51EZ4i=U٫YFΡ9V|g(`?2Ԓer)[UW) ol4}M9!~ /߼&YyD99W\Qy%L3POn.&`;#M$3VK ꕟs&WZѕ JpUg bNW Е.j03trJhi;]%Ztut'K`sj>ĽtutBKt ]%35KWvvJn9I$1:9]Xrb .;2Zy"?PM;Еjߡǒj%’!:CW ]T^;ČtutEKtg@StJJhh;]JU[Uug*+tЪKW =]!]1tr/mrԂ%ϱMG0oJQU^ze,:u~?I4fO!>[eǚ;od~81c.F78Ŝ9a#:F 6bГGh9]uIMuw8kZ.&7Xttx2` jj^o\ۺ>3̮0"lP};~+Xb^ ҕ8խߋ%^|&&R]RD EdWUBZ%4lҽEJj"]`ͺoJ"DNW %9ϑU0JW 䝡++QgTKֹNWjˡW VONW;v{X*t %C-DV;ЕjߡLjW803t:y7%C3+"CtkԶ$NW %=]!]QL+vJp ]%%iW+c]RkN7EW 
.V]^L(E/]%])0kuCwzh~rVzq3EJr pkwj7COVH?LNi UsqWY?JޛJ޵L:u bާ`0rff <6`K? 4nbt6dY((qhJO%[flk_VhT?*}yxn.>tyo\7W62]wV&3_`*Gx\͋+xfW]<4_ *[\nZ< G;ٻ8n,Wyi$y>L^33,K^Ö,+meHVu<*_w :|BxX06?<4N|5.GC=|=oK W}O//O=_rRRϖ5;g}Ht-{l1RMqb!1&;Ʒb 7_:7,0o_^N_\c>Lv;,vdS8JnTRζpM&%6ͧޙJH&vG-`0&W(7rB[r)pxc3<5sg`o~XiImCP06B[J->a ͚`ePBeBE@iC}TMA[[K k<".0 !!.A 5t9ԚfBa@y$dDZQnL. #/UY {s6Y<%ZM=n,\FXud;J` lj:CGw%8CUWU]%8otk<0 :+=D˅y MmQ"z;]l,@uVj@$F=ڰM6µSE &/ (OA/nގbA]R4Ϙ NhNo: z:10B|~ <`ա_Qtƍ{/x V|D#Dm3bka%a1=CwA ,B>A0m%.fi#fBmXg O`y zBX .D!hd5F !8r(2Xuy`0/1 JWld&;n][!q:@@8>.d!:T?^ u穳 ml2|Y"!NC/i>'~r߿8;7Y߬>Y;eb&\+yot䁗΂E˨~uiB./qbUoFE&}TpGQaL)Xf. mM!g8Z.f}  ^oBZ]SWՌ>xX(ZFaσtFwp0elEP4ϱ ƢtV% PyHM!jPk {#pD,6xެX8Wa13(cA9nPgiZNWhG kUG٧Yj3y o3< [a֨Jn,L퐄\2}A3Kp0|>KY0N<]Ngy>/^^[ΕL`>`޺,l3S@65g{BggţbqԾcXsZkQ3y6hY#h4vfcƯ,c˳۬*3659x7,JxKtd@r8r* ~Bg؍ڛ|=Ltp.Jtrw `RK{hJy}[M3'K`?V&"[796 'w e1wh䍡C Ge^vڢ qᣦIG, U ʏǺ6 [c <4 [hcĹ[ڼVzPj@hlLz0-dđ/(Z"v%< hd)F/lnT*KcP˱X(f|uśH 8erk7 \Х5>]~F ϫ+{-J#$QG 7[FŇpz/]KOۺ/iρ];{*;?Cke`k{$-~_~ov{ί%Eo/ANv7k>;;ˏvׇ{w/n2?F1?dW|ҏÛ/ /Ih~9?‹o ]O:]xwn='=/nă±u]?~g8㱞v=rL맪R[ '֜/HU-)m!&\̍'گD@@&cHoc WQy{{ı9 o_lƱ_]7_߿}yTiC{%?ܣ@;o82w=zw^{zu 2~o=TVQ_~Gϝ{ŗv a[Bl|ipZ^͏_a\kښ=.ߜ/:(+g`Cy8}P& [cݿn_jZ [{v)xm=۬|$;L$?`?g~؜)Ks6Mioc!|ɰpQV䧎$? 7bZOv븃z1՛NOˋ[ 3~ooOH^8JB ^O?gC6&` 7V< =3ٞg ѕ|lg{:~9悞:]HTztЧ]p 2oZPnzt5TڒCWQmg{DDt{s{ι W6=jMPJW2ySo6`K|p㞻'Zyh=QƧD6tu^L E6DW=J ]knO&JVzteϞJW ]m&ڧOW%;gHWn>pn&\[~tBt jvasKޣqw.?Wsu{ O<yEͯ/2{G'ȷŞv9e٤ E+O[btuyHyX^:,yW?Xdr["0R/CKvGw?SRpY>q}V/;^&>-w[Xt8UML}Mb?+F5#ƨ'*+e[f QBI7g%f0aV<;u4Gz[C'=C3EV%aґ\|CrݻV`43oM#J˩DuBP崈`uͪ1P{YPVm]#ޘ<)d6ӥ-᭙ew~E}|_Z_o\=kB9St0)!Y 4;)[&#Ho3PoFs5#?D`fLEf.BE\q\򟑻|怤α)Lu/I?峜!mܑz1"<Kl$}ԂE p"$JO5DGSq|9ů}TA\WϦ_^/f:$? 
} Fp ?Z^oiOgVuooe8zv ZDώ#hà^ [(tش 9vLJ SQ q|,ŜM C*pʻ"&Kx-|JYVWD6%DYO44 =G#`'!H>PfU&Zt.F%S>Ejc/(9i+RJIyw;xJ{jNMI~RQ:Í3L M @7UB^K$&0kJr$ʞ3e'/dYgxfu۽-''l_20n?~8RBkytA'$]9,a>@IuK|2N9=8&hջcxRx#  )|Hx#$AoFkr>K&ۯߐQ?fe6ò{@:UH swE6Puv﷊}Bt| n2[YfTAMI|EIs1s)*!:rbt?,VXy@f&" ^S!v2ucpmQj7t@>^,ryL$}ňsp2~>vXj9qJygT1xsp7+Iɸ:7YdXo{ Sr;.5ε~1hn>R6Ve(tŴs=.잤YU% ,Y̴jw}j^_,!퀚WJև m>D+y*Slrggzz}ryjΛ?/t\-)˼_Ç.5ɗq.vH19O7i~jY aXCoPDi!2t= XƁuciYL6g1kFN*5%L$\&|2jE ,Y.(- o-ev;?սhwOLu:d"T$KjCPF9P y"MO"H:+.kJ#9=yΰa}aJ4o#"8I'u Ȕ^c/Ub(Zs$"2:$IޞMS FjH3py^3 #(cQ)PFُqڰ68  m/k9̌/74y`|,lZpa/'o$r\4r- F !fgK]۠Z~9`{8˰ɕJx%@3 :oGML(J!&kg?bV\̦vmԞסv`mheܠ{e0&“2J}L$Eg+߮VUtmqX}q=o|3kۦ^P`#ykzjҖ^V4RzQ(-kP%˻r@BY(1PsN\ A큶 C+ɛBw*  :X:m#]DFZ$fm4%`LHDI"s +ƜAXdR*Xi KH!$#]F>.qX_3soG~x\ƸըȘ%l)`'Lvam~Ŏ wEeڈ.g C/( La߇v8Зh_oIVKäH#^wy2*LZyЖ&ϨMSRc);wp(DHBX䈟II!cPoA>kHJ/Td_xaP5 RHj˫g0c&%q3\,KOo0fIy]UQ,@:‚J'J'ZH ZP.8 G0tlA66!4m[{ 't6'2vً3^+]q~`l|4K,B ->EᢈTBb /-9yM~6NM۪X=?&$5Қ J)Cp"*Ƚm:%*E`%TIn-bV:$'~qoLuǤWȮEϚ}v_E 9M6&KS?I(2mcFSeUd;3a]r- Ȉ,kP1=ym}8?9dck z@tR@eP?&DA$M9ê3Zwނ"[+6oȇy&@G&y.(QT:^?n=Xr{' *y?]W!wqTe ׏?}uPfuLN)2_]nߞL=~W|C&z|Y7ď/ox4>p~iZs<9HOqON#$2^ON Iv8M6SAe~lůy:bM\NmxwtSZ_50-լ5'hSm0? vqΜ̲}>9io|2kk[4 Ys.۽]Zڧ*wqSCFl0Mx[yRcܸ42 #նɗY?(2@j촰!dr?fO$:a&gYsTB.CͪȊ`J`ǑC9N`dR=;O]yC>yo.vtVc,W! h0옦<3Awcj%t|VJs?:6qͻ~q$UEYr&~i~ޢ1|'y> f%{hhVMl;^,68;mp^P=se9s*4"jɸFS:u,*ħ;'5w1\O:;ľ>ߡ h-8~`f؊R.Sς!/d3`tlYFŢI)1Dm "`B9Чr7W])olY{9쩍u{[S ;/\.hv{L(f/] {i6ޖGpiSH[ q%HFJһ fÌ(ly+Z RtNgE z>#DN yR$LG#uJod?Yɪ+Ws(]f5h `fALWW"VzDv`&*~,v3^lDv_`] nf- *$ä-O+ )ʕjV/ };@tG݊QZ$c:V\4r4uQ+y .'E9MYGnU)~BҴjOZne{Íi qrzAn2gft~(U#U7t(l1 Kqe+ARw*IխSɳs`HNPyz vohB!Ю7xV\nT9B/~kuMɏՃ7ӋoCb4.<_σQ/fs.] 
~Vu\brq 'ܓt kFnnGJF cm_=sp9,n^Y|M6PآI6<7}~_W*ꔞbcQEV4'~56~pu~~w?{}ϟ~wgٛ9{kl$<_E~75]#ѵamu6탇̢JZm |~= &>;VF*F{qj1[m(ab[afL _7 .s~eD kMcsMl'ݺ'98ME9kA )’'pNmDד}0ye{[yHVR*QH-pI0Fhn'T[pX SvV:i'ڌYUm;w2\ :;>zOx*GΪ4_MJ;y5Ñl` oTx),q$NydS#]Fwt.)47RD\EAgUYL9vW-¸` x!,̶x2s')"lc*lFƟF Lfy37 >I][·-kɊ7['A{p=9J2ѺF|iNs]A K0yRzHle/D#ucRT!"䖴r$ XF E7)@3ViuNjmUe9`Z$b\\Ⴉr1+X#@Pk1-붭fx=ei.3w1yfyƥ:_ 8i]\p G 6htjۓtNǘ7 θ~;OCm ##:$AK7's%@\DNQ(֑VJz}JNKROuk+acZ-Bt{' 'SBi6зҴϙڝ\@&Dak;c\Ptqx)G{4 (}Αz(qo yc Br=/ajyFΦO0"P!6Z`I̵䉠(!MBEep1'37.6ues`t<A 1q!>,c0pn4$f%|xCR 3|*xY9^ -WVgJ֯/s18"筼 ?L^e Zvy /8o5\\mD!j3r}M7PQh@\^\m5qL G#\CٱL-7]WJEzqB2vD*lG\!xUVȮLҽz H\!ш+$AWZI.2Z+!)Ǥ fH\!cW]sURE+ނZղ; nyavSVE+Opfъxu}0A*r5/SN(*Yd+sm+efŏ߭$w{ %I.Vv^¥z_SKܲΞxdbsDA+_^�{q=^ rCd3y+"Nppz-~-5E=B\s4hBJIX-aJ_-D4$sLWH0#ǣdr9q*ծLeW"ŕVqD x,a\oƖBX`RC@mrM?| cȢ"nҢCh[I8[ m$j9g>s<3 PS{Wg+Q#\`j7̟<#TUq*Gg+@~USB› jv\A*<hz'1us~`=WԞj\=筭+PS-e W(i_pjOYO2WLƍGBy+ K@%'O6z+@%P%sĕ03LYPy>v`pGB#c!DHO?*Vi*vu=d9_l4&Oq25jRY~DNOFF4tTdH c0cʼn`v8?-<2զR*8;{Ui3{*xٸ5)Hdڐw?ʜFΜO@f_/9Ւ&q q,R1,2cX$ e]P05— O~U2]pJ'nid})}5m`(a{iA!F{"~VK#=V{+/BW QW+m>EW X?C>WSB›lNT U-ճ"[`@ v\ʶx\}B8O%SROBY]θ]U!f0ÃrwPlIb-Du+F+^z{Ԋw> i4ÇiKחPu:Шrڔw|x;ݵ9EK9gճWOҳ A=kd{'P+*1)#97V+[u壬\evZr)Ti(LFֆKCRKq`0,|(u1fBïػqzKeOs6YerO}TXU2pΦy+?[\#&T+Mq*u9J+FOfğ-WP. v\Jս;G\%)Kopr%WVݖ++(p}zvr7 Zee\W} 个'o֓kNf-z*Y&TCMOx+,7BR+TLq*pubZ7|l?@\\}5@!*9W(+W>BRv:C\ É'_+fbvQ^rRdM/Yܕv8M%ģQs#V+/*U*4ݨz$V.:!!l}%w۷F-3%Xp QӨ;ӠRRa 1,po*oܷk*hDiPJ MJXbAiNjU"Jai"R$Z[ǫnű2v(Ij!ԶWԨ7jf XJLjL*&݃si*)K#ض 2"|U4J30pW WjOyT=m<ճ='_?]O. Jv\J%u#,Z [$@\ZTTɻ,ĕ68fgd4ti0fstK܃O/{h6<}IoW[ӊHkE"*C" /{CR 4bln6`}~nfEDGy|૟/JDvE+"^j]v7K37E8sڒXf.Ki* )!O\´d4̺DVhN1bbgJn5!i^/ܠ%^h;MEW1j]-P~ft8x_r=Mm½.~YHN PY|41^6\osz?\&w41_0.aY* usV5,y[ҿ_qlؽ*?iAyǜ/rMh<å.) ^$@>dS̝E)?8F%;ognMHv57ܨ䛒(̇ȔQQ"eD23A,s2N4Fhr: c+袛?^G ΀@܅4}u_FL"DUB1qٍ0Rj)(#11qE䎘RmO,mvYϟ?J.Lp6?;\>/bHwn_;ޭdՏY<-+޻ZWOglͦ_Xw! h -8/߿].(p>Qde$.Ju߿eqh: *39e'{fL!Rq(+7\~Ԧq#C\Bx&)x"hSJ2R%2eqҲ뼍t]| UmBcv%w _]6ŶP-$nF72s0XEI4rem)ܺF4 u|.Fgz9˥njaIӗ? 
.a.^fn@ 5,j S'MA, xKѡ1႖652=v>SeZgvg~™&\_מf UAҷ\M)2[E'iep=)FUoy>3xZ}xnmVrȌj,Q,U4qʄ"Ftpb-PDd\GIJ n ̄85qꢌrTFU,E;jRb"Pmqcr /A0Z*3rXQp:.=*C=0@3X/X;K_;/=*|}3~r "Tq!fhMRwJhuah$:wDx"/ܸCSE2hc&1M5+d 8F$:!~ ȄX 39KH#=HkA$N 4 Ѥ7sǻ*&@qK+W<\yEm<` X8l! 0} ZM@2w 7ǔAE;R] Ljy,w7K +nWweI2;" ̃^h)-QHY7xxI*TчX̨/2#R'o}ӡo^)R&w5$׵;q4^ۚqeL[`ݰݚW]ݣͺѾjv| \[mn6!2u/ouNdEkh&eM~Qlom-svaP1K'RwOyd$:]lӲY.l.C|c(ҖVEc` YV7zv94vk wIHW4 '=TLYY7H4 :6oi={ҹ\]vvEa& 7 꽨ߓv3Ymj㶉 Ym֝n3*DKGS5)6;•~jc,T*D1 u) `lA*!N|5,,W,/G}}y9mNuxuշ=ԉtx";+oӻ5kx; +PR LJ"x ĦETR2a/ cOzLzUc9myrs>nA3x,v^P^.)zRtqűoNYBqUWg~T/v4W/X w2[[(POSz#y'0yҁ}FyR*u\< Dy_0KG[wEbSƟCJoK ZTqo C A|LBϜY"}ϜdI 9 єMJBLBd-QYBuK9(IĈcY$m(1+2(mXu$бxqDޙ8TWP3GKpu%TS/p1޴cqϚ1ش9~`_ ^m9^4iZК\*o1_B4QKxҙo>fHB {,㋮ޙ|N g^2c..2hc6)mDB%xeZ)NQp!jHƾR)8R4#=mv)+Ku HDqyQfgNMl.,-TIa#?=}MN nFu-S%_gIYfk? ԛF=W/=]j_i\ _zR?5V4(*~i(]]y*.4uQR ur%NNi-u.qG|xz6[Mc@ª6Nӏg~<b]Oz}OƮb]7j}7v,?!??oT߲$Gv 'kEWW IݻFU߼k&S|~G^Suøiiߎ_XiG[\A( +)o4g8YOj[/R; c:~8zKx=`}}}T=>?h,vIbctQ{fcJC1F&dMpT= z Ro[ f((:a,2%j]B1K^G^{y[,Q(LHH^2(9I3]"$̀ C(3qv(0{웎.v6ʙ2c}KܲɰZ9dۋr H9@ؽ4#bV@nh$q'T^2HE4ZO1R*Mew+cGFAFkZræǹ h1'gQfW*[o۝emyYH1t [l}^:GKFb"DQ)e-8O#BrW?EBPy"$ N:>y 31` l27|ӞT R9؜j}$|"b'!慏gbfR'hw|l'~װ߹^.ghhOjL ;Z ?iO/8G^(^IQI1"TK"~5:gK1IfI.C,*N(0\Đ_ m+KyG%&/:cyݣ9cY,<䕧h\BP6:UmFQ$0.#똊XɈS z3xu7w\q@}u Fx?QpeSQlCTOF4t}Ny>Z/%!Fƀ kr ً:L=3lsP%ѩئ+UT%$HM'(5|6J:c@l@{ƈ肎^ɓF! XԴ&h#c@:  ,%/r=,dH&A"yz<O:Oz͑~iLI+OqdiME„ʇ4oM}e=՜C-r\2<4XW rlK_ 1}Yay_;,Q:XzQdm=$0422>Oq47B:RAeJQ6Z6ڬX1$dSw;̯JRqs6cO rdU#u#L{{\rzqY $i)$!9pEJŤh 6yk-8 LҘz98Iߓ^#+i1Q YLa^(e7؂;gY___^Nʺŵwk_}. 
=6}dyRI s/ngl5{ tI&m-A#K"`A5FE~"I7Ο xPRwAД ~־q9AޢN @Hj8ӥ, oc ]q" AW`i$RTHF{5:cԅI޹:?!R[0:kC0FI}0;TA(/="HRh1@㽒 m] }^J!yqDyK_wUL JNP--FJPPSFgAX@uƕ0k)d $)e5xZ^LΡǔ+h;d2,y-mR+R( qks/W7TrnQ/@p]`Ix< $*!H8rH8zrX}Rŝ6JI pr%U%(Uk(ҬdGJEc+GB"gr _+RHh* O Y Oc;LNv;v壩Ͳh:{r)snwϞqaN\6D^dB7.M෯Eg˭FNpHl={ Z'B$55!P3'}I+FdXO%DBY!*O%"9r`t: ;qSg:Ix;M?{WFOdwew"MɐlgrWlID"q2QVꩧŪlL tp6icJA-HamDUɔZƫGkkaA/b|TPƷ鄰yPP^BǗmB EeeO1AQ䓑cA)[ y3K2֥XG3 9:usL/cMa*Nf_?tCBŞHMAXlX&[wDLj>ImUڇwImUJoImpR^:#\AW 鴴CVcMu:o5v9t 7Rh=JP;WuxbXm]A:eSCHb,e$Kc!l0ʶeikɜlM[gJH: cdʕ (RT[ĘvC9. ;3JBCP#RB!J S_fq~qM{I%?Od&Η\q;Xujd2G0?F=}etkhu| *D2ĄTJ"=%YlĘl,|PrWRtYk%3f?f{Әʝ]ٳU ~f/oYXpwq}Eov{;Vfo_`vVΗ Y /fG!YXqU*c%dy-yK7 yh9nٝ~zm6?OϷs㤗]xOO{Ws,3oUj|z5 -_wx[vAqշ\7[p8׫cg˖oOcѻvf̴=RsDi "a;Y\HѣeeCG 9Ԙ{ 8Ckpa$1J EƜuIcIj2U $&epߞ=dꛭY|0x<>kɅ` QtҔP)mZ $磵əoC&罉Ұ1k-+eH αQK$\jlc E%ii^; b#@+=9h#uVr* u]`4̱FFLWi4@,T1  朝Wf=˴: t<{CD 7%"Z&G!K`]}2֠tmr|jH uwM06*Opl'/ }FoJE$#v3qF8V9. c >߈#5qiGV-|P3l#KmmE%rP"G e,QYd82F!R$lCĹ~~ CAfq("ƈ(GD Ą'Y$ (p*I ڠZܘ!K!Ɖ`"wRcc\k[}1VYTL$lRHZ!ǐVbR1acDl&݈}Gʥ!:iɡqQ8M2 %u2.d IY*Cf&m EPc $J~ŇHkCC8 w2*sn~|G:M~c7}2%ԝw:@ڎ݌"{NGP%8B?y < !}!7A{SQ1eo%@TFd&hQbSf`sYӖIO?L^~=\yG(|1ᬾ O/{IgeIbvIy Զ&QI B!MXZ,`GX6׮蔗>,(͆FQ&J@jlĹۈ/Y|Y$׬ s-NY&CtŎyne<9QPMmT.r Ք&Q<1I0hdtC^/DZ&Q84Z!Yj3q(X_t>o쐦ԩ'<c8pPPhR}*93i}8hǡRR-?5&Pwp ee"Y/ yAloi(qb|6[h,IYm]LNٖ樢S&#\Dd?申%UP43onb KʃWnyecmt*M00RI2erZt@BH@6`C)3ʖdcjrmi3N=m۪[Ac:ҁ-#VvhUq;i,0СiUʭyд34 3<4|7ֻ[Xj߽]򻒱va:(]KUٍwOcjN:iM/&ljr@c0X B٦K C/=cVq='Ը'0Smwun몿o?MbC嶾mscd8)j"*=qU\o%2eiCLV3LсA&Adf_f.v̧+0$N|po>|N^o㟿 Kk.Kf~I]諦p09=V[ZDUϪ~ӛ9Et;)LBI!J ͕WݢY25v}9XD VwWZk'?'P'Ǵ);X8&Q!^}뮛*G&^ `i|lJyF6|؄ղ\;D \UiqOR!\YS&\hઊ]Ui:\(\s+g3xઊXJk`pUt03+oc Y`mF5vp)WUB\9t>veNNbYՒ?~Q&߿Y𛵾ajW_7-Fʊ<kRw>ѡ*bgeJx4/^K{ı\zjeI͟WoҷJ뗹*>Wdk?ۛY[" Iұ8u*夬V h2`9I4y #ܺ`- ϳ&{~^= . vo? 
0/~O,w'ķ_SWR SᎴIJ(e_ Hyrvʄ)yyw(o,_z)O\EX!}k)/y)'('b|y| yq]n2zV;PeԨSrVH"S]Jv/s2iu:1 3NL,:1E %R6+]rFoMV]" _-!d+)h;xMhq" ڠ)Ѹ }.z!RV5ƈXdj=x8{ Ug {d=Iawl6ko$rGۭ%WUB%@/#3)H6-(FrYC,<j#$+HD!E&ĜtD)rh|-8t2 J[/$+rmT +-Lj`!s댉D8{YoWlEQ4}^j4Dv;svZǐL\2Ad0He4&pT {#?f%YgigDUTHXl|63 ѻj5Zl:IcMfͮ AYmXk)P)8*CȊlN%}V82eDi38hDeF#Jo;Mf{>:"B'(]璑=v! D|Qh*-eDXE~ˡ_39mza.{0eCQYg0ALPd$(vl=kh㒌]u4ӐC[>$:!]H{Dve$=n俱7h`8ݭ0Ƨ+x#blq:- S(٤._MJ1CuUu.sHTQT%O=pr/ܼ(b^$ǡ2{C_ =ٚ>!POr}a_)}ۯ^׌a8Al -0gpHi)!4GæWv|]^iRŷ8JOYU_=k%7dVeZˢ-ή;v4TE['IЭ25ԄjR48 l㹶' 4ޞ}ރcvV84BI%3w;ǻYPsWh&uul< BFT;I8&[bHzp{I%K mn955,4[̥޲9_e8Ufvr= c_+]O|5C[][%;tŶB a7|:}?Ռ׳*SBb|Ѕ"K=N|%IV`|~/$%i…~_қ($I~rgSSuJʕC7~~| o>>|!eМa*9MՌciW 3B SST&v|8|-$ ݳ*"y chj_"8BK-hM?P޸Ih<|k=:q+O:`Ū6R\2At&%D"2Vn~En<\3I0@4Dj5I4Mԫ\l_[3]45 d 9 E),tm-%mp8x.͂sXǟ.4 =]Fréc}}gZ6ٹcl6avR'PqY%5r)B[hUYp@G yK&Ckҭ6*U1gdJ8[.ܷ!d_~BrBUbᐡ6}Lb+$E8LAK.Şs4w{u|_ cgǿ.gj:(&Ď@Rm49Ѹr)r-`ž+VO?~l_7`l;S_Z{{|׷4WG˃&l,\ֶ&ݬAn_˿7VF%XO<ϚO9{;I f7vG96x;k5 C_voNR_xm Zůۿ.⺎i0ū: g_?8Pd{fU|Ec/ޜ@y.=aXVuӬ毗ƟpvdNٜ9lVeǩixiV]2? f2ndO{H1>B{]  }?5rAoWp9Yۃ KnH1Q X/ΰU/wcIKWra:`m<ȔPeE]UrcbS= >ٴ6C#[sxf6rPU9naubE#4>MXM+93ĭeIָa&Zn I%YiM1KTb2FW6yh1DkP5[QkMdu7@ 'f97c|E0t Όi kPw-`tlr5-I%g0^X1{_+l5 I3ƖQ. l>Jl!vkr 0.qt? ܼbHԆWy{Jc3_ ߙ0*X!qAbOq#Yz“qbXyA}H?cr'1Ѫ<HK`gKSbZ)јCvL(sRL%`E&d]k@]0SGRaYH!hy[a[grD,x&VPgM0"LGo1}Ը40:--ZAn38 (g`>0o g! 
y![;8ji÷!S{,qGYF0 ohJkbv66۝B$1YpQ 4#42x u`KqD>7` uJD%LhcIX <) 䄸G/ .B`ȃ%J# I * !5t#hn= %eDsAYDhcx+(nGf6 ᥻BũLt#;ӠkrM`;o8mPZ+C:N2ub|~5}?z1U@@"QX%@wAK |E)c8#F^]J0F!v ۔gJA l @HA-HkmF-`k`y$~*g @Fkc/yBr 0X`fS\l`:SĄ^P.Ma)M6`6 @Dg;ҵ?vNY g&,BY̓x>D(eU֏қ#S*"R_ j)7U6PB5$|#/6JdG`03 C^PFk~ʴGu CdOp %D0u1w(hFPma6^ `w-> v6xY:7 knZsz"e0ښ@Ƭ㌑>^T!=;Q8rHE6rC=siS"<ܝ*2XcH`[ HȮJAz&`=z;z S%2ÏhGTPO_.w`E^qH"6E[NnU!.XaR2 -8QpnȢ !I\{!Fr׳^ u3"ZFCG:)nILKT19)) -יڨ@zhY@5K8P6ڠ@SB*!@?!\SѮl WgS녮i`М=ͅ/V==GO@ep0-N nr`-*-Rs/ R?5@*\#,&yU]+ &Λ5 M$ /)vZawx[YtH} D;m(_^QM+Bs~1xEh`WV*_^.sG'O̝̝̝̝̝̝̝̝̝̝̝̝̝̝̝̝̝̝̝̝̝̝̝/a'f_{cBEkʲQ(,[\*N @輰Ē2Q<Zt0(== : ",R01`xt nRtf@†hLy8g3Ed?遞1fztE~67DF}(3%TgoDXl6ZQJ0w:Q8pv֜5Hbji%%@;QDO"9pj눧0ՓHG!Zi$c0콠"Տh8V2vءd9%cwJg,$cY,|RYu^'͸IQTmòD 8\b[cQTA nIN) @ddl ɬvB TbSE#nHƞQk %IlR!]=e^e) ӁicYsKl;%Pvg@vS^Ym񠉥P3VhLjՈpIG:" i$, 2:MzISy*XȠbh k|@N@8/#4cyYsʨ߄dC";JDܱDY"frc! L7`Yp `'y{5J.-;5gDxU[a|H{}"X.,\x\kCs' \&6*-Il;53G=y\| x(q('@uEVMNp9%q5:4A2tth%g%pzsgTc}t+B+`Syo A7bTXP nS,⌅D~6E淽W=^mo[ 5e&ZboM"[\pO&#""TpdӍ.B:9;mOq)rҦ:{u1胹ч5d2puƱѝjo^n̈́?n&֜SQMD]3.#%H&H*pe >xls4Zg0X)A㥛$ݴdEO/:|쯭,>nvKj{oڮ^9Wsx^9Wsx^9Wsx^9Wsx^9Wsx^9Wsx^9Wsx^9Wsx^9Wsx^9Wsx^9Wsx^9חËSL(P,Rs~) BʡS,R+ /bA@[ S,V4w8XbXn ,8+ea@+t-!c;xC9پ'90u8<ia션aĤh`1(} [v8H(1nӻpG0:수KӃ J˕K_j@ϗВ)evy;bdV%b Bo-eDs μ]QZӍ<t`kV^aL`a5$FɇAUX`smK=6hCc5`QCSO]V&<Ɔ1L*(eZL`)G;xd#ZvHH%R)B)Z /]n2j\+#"^QnWJVpۻ2 iz zoOMy$:>O/_4E3Jx#]Pu c4I֊iaJU#35wi|~|_]x}==o8YV7F+]m0yݷT/viiE997yo h{M鲭^,w0}+=z1v`k͵ֶgXUe}G0|O\*{䪏po\ɱT@ 0&Q6՗{npu?÷'|m?LԇO>5: Qkӣ Ӏ{UɏWDWUC}a|P&]>uԫ|zOXdVjQz#͠tw;KQҦŪY…tjkF>@a_g""V}wUs! 
t6:L>{Hrėm6Ҫ6 ;<)W/N2t:-#&Z梐V R aRcv3m8D^Q͞q`- x*J9\kп!4$gT#jCXNm YaHu:9)P2u6&st9l^ރǜ8w~ FOeTi!!!!!!!!!!!!!!!!!!!!!!!@|3XǷ'nş1W4h%ơxn/kKq'2A7ðr\^ sZ[9 43D0S9<[f{USfNl8]%7iQ la% 5!t;yx];z7swȗ]h+=OE?h6 05XxIE`QRY&A9pk; vX#FuwaTwg}W  `>H(ad^M7C>`}qJQB֘N܄/7lE~;z_iDqVaw\0^R;L=t1qGMOO[^OFbJ>+/f%(DhP(C K)(B߂>+AQs)A乕 lJ,A#JG'Ϫ̐1 x!Q -Q$^*LzE_JPgٮLoKF7W}Pg-wS17JW R__/[)+Ef;MzhsuQ"0iĞ&g\&"ڐ\Md60wmEbӻ;Ū^&|V"w숖ʥ`>V W ׻D;=@[nbÁS6k+[G9o-wK@p4O?Vf嶂4{@ E+DM+`}K -לU%pͅ^ix;OpybV_Yu֞uSd1-B.ng^+;eiY~gNOR $ߎ'S^!:z]P6]>B:MoO*``x;gxuR^߼-ctD~|,?^,wfnzG0H(toڦJ@M%Pr_׮}zi̦*`'gϾ?yWuUSnKDvo0Q;dPMϾ$0Kn` ͮ/qhβKv}\{='w_S7y,d@y%"=zcKzB[m-e⯊uez[} VPB:tsQ Ky=& !U]wcԉO".>iܾfsBKpVX`4lv[*utк +~ZNojRYs ۷lb֓sZKm%0\7ŋTx8<@ŏUWwIjƐ҇ovf:P*9ښ[(j|+wώFs9ߓ`2~mӎfI! t5vӎp '͌Ej6jS?hmnXMq0__"pi.c]ugKMl5Lir1A+"gØLh O qOfj&LzouֱPQBE[h! (:խb si{Ÿbàz*g '[~]lߏGҁZ:`$@7`U"8*6A?yr9g.K6K>$7Wz(1LN~yj[5O-sS(1nRy7U;E\y]5g+)M:݊t^^Kf<0J%/T"Ye%2*eO}*o#|_c!gfnHq*N #X}uTh%]3U8E$'|JT$+s Ae zjM$J-9:ͭB6q[#gLJ/T<ϺcvJX+ ϶X]n;h]0uU«B:Uz'ܪ2a ܿƤn?srdPkJCO')BLui {>'?(ew$}gDxH ;F[TxjusԌJ3ڜXZV9K@}Ue:P+$:KAS;@(z<~&p!ʠl=R(mm:i\, B0)v<:Ln^[NhGh64>PX qW&]/dt)ʝH0{1] ^JGB]Z;Bq%x>&2Gכ.΋'2R^UYF/,2iza><7f Y1f@ VR \sF(E`6cPB%+cOqNAH 'E./ƒJW Hh& >,wu9{9 %Clƀ+?=0n/sjY>}˥;*mDCXh~M]8{B *$H <血>,K;O'36sG=@V qsɥ=ATv/xRt ULBI[w_"jDP+.ns9L Q$$IRI)Ox _QwGI޹8$7BRc7e V)#,Lj7>@"`u2MS ΁DSԖAMkg@!*]Ѫ͖mm>wkV\ss #GSgCG07#\bd@l&A%qwpPeJH4b$Ȥd Kizdt 2|X>tp'iczX s4@;zaa,Ιܿ~JeL̲iH(/ }J=+;f)7 uMګ`׸ȕ>tlzuy46)c N|V.pgX*6#pmFſN[ps-kũUhke]\7i Ғ~ JX4 1ZT`C(ak"]{ 1SU=:OՊGwavԊ)(ZqkRF{(t SAR4&I3Et'fA mPq!•YPe2…Zo(m/h-o!Ti f 6PReq C*.3'MtRZ-$m>MךMP%Q,3q"`h-*9.:y-kG.Ju/AkI{aqVza>(_(3M;KD45Q!c*>p_i&-{?2\ ^͆Ԋ B6ٴ\LH0gbU!WKQWdQuUWWoP]͝T]*y)PlRdjm u[q9ꪐ.F]jMU^]AuluUU*ԚΗh-T^]u~+JaKߔΫ&g0YaKXiz2>>17FSqS"y9<! 
J &M— ̊E:.c״9~7X#]Rƥd * TȈRSFq#Ry]LENATjG}ː߭7n+5yy~?ɿr~$MnONM/H4.O*(2EWΔ#X`ZQ+v,%"$j.x+ rPlHkFu̔ a]P;-:#Y0@*"JsNx6[#gO\=餼[ yCP]0j{XQIk Jłut~]fS仿Poq4 QO~1/UrFiy<S+ʳe6M|*tesڒ./'A5`R$38EȲGZiW,ȒZ/T֞S%mMVfc(L|1kO]flQFC/bmnɎ &rf;&]ٞBs; WGInmhݓ7L{̼2r;?L-ͯnww,yG_R?1qSfuf'So XC ؐ3vL:[4 Pp~26|J3{Y==Qݤd9nf; /SZyw@QOIg1)R!#מV$h39d+ڮfd-ޕ5qcšYJ-a?(ɵqraxMHvRh!%QԲ*K$N΢SZBHƱ`fbPT!ǢR0̗Fb7W)FƖXHa,4=>*^:͙gf&XWs/S?Mw~Sh8}#se& $k)hV Aˢ% Ely Yٳ,&WpJ`&@Bݎh0Q|Lă.ĹdVT̮vfGnXa\Fb܎#>j8[.ui(ya\d=.9Z}Q镶Qr\HH "%m[D99LĘV#ԩ,N@u>H]@f<ѨD2X (PVeqbP<&ХĹ@%-Oif'sζr?ͮ:!{5hֺ(eN I{5eY%1=_}& xN8mZ-) JJX|Rl( R|JJo"]Ijx,u\E-e&r:⨕<GHBS`"W:J&S,д47Gy,j+FgqɜS A*"qF)tj(08h%:͓ej 'j 4Ee*i#PҐZD-N3=o'|l2 ӍhZ ܓ.hPR+45&J)vR;O (lQg}¦%l(C7&HhЖ Q3ڪDL\Z:-D ,%'8.=62+ !,@L"q=|Τ8?]4'cp-pp>{iy'(OuYD$-{?H>By2LO{0R$M7$'ⰉWq푦ygsX]Qț2$'힒#PSwtOt xGjꜴSJޠ7NLO@Ǜhmjq쐡Po>Qdz,>p;?OpBzjc285=][_D]c`RhT ӭ Ø,n,vkq>Rp8tOmASдa%𻴫KvO\d /zq~8qa8mHmS|ز\5󟘏A8 +(6N&-p:mT?]s$ClGŮ vtMKOb{^lq=Q.v@E#bu SIO ",y q夵דhayU={~[ǃ0`sBJcx4\Ҥ#4n VhTL)0Nsivؘ+iton۞|闗7Ⱥ|V;dTlfTdY\^J,|,J>lE@>Z>7<哄ieGDX7i7bZJX*gHNydYy%vV4 kYyOwM(,CTV>ODj0.zj̺AȤ U#빱1$䨭I$%xZĜ@劉s.w:BX6x5őyz,;rL_\ 3{rŜKnG<_k-EԞGS7G>ލVgJ&NRH]qDr*oF="AI{2Ky4*JFd>m%A|JI(22 T6[Iѫ'%dDʇ 4(1B3CMLɤe6GQ)bi j}iTLIb[gXTk̍/yp;&\k+.V䩉̔}]>]0(&U2FOO&H(I,cҚ%V [dk3Mm*&j[,z0x*m^y&FR!EqrLJlZE,T%R9\,DN@E>7U> gǃqba~>&-,u͏uD}/!%^& Hr٩J%`HN'! QN&O͎C=kG y4lG:IԇUGbˁ5q\BBP$ x=c=l<[?h٩g#{ό/_ѱRF)p\*D Zb IJ## VɃ /6;CŚ9{WQ8d2tP"E5]@,BK%@7C'}aE'=E@ޝ}6A㣍0Wo(}u%4΂TuyS+an[&05Yx.Y5,kʄ˰hd, ZN6G?! 
~_qӱ7uH/|㹭pn8S+ i^E2h^qxȟOv:cM;y7&t4 tZw%~z/ĶX6c6ʵlq>o1^l1a_a\`KB2zQ9YFιdK*&) J?ukVYL.ƍ_ktf?O&ߣRMPh[lpZqd MtCu(7}d:M۪$jԕҷ׎[\t>Nφ-ܬY @Tk:-u%G2,6y5mj~\o) (lRxm^x6怀9x/JSJEDu;l8sXT$ī;'ߝ)>5 0 ۑqJJLdsZ-rNQhRJL8#X4R0p=<}::!*BeGG]Է{uA^yrn9E)S_O꽨ɋ\Zʋ:mac6|*S;g$*aUx'"~$iqisR?̗i`RG6G 'J#Nw@Z&V("tu d:Rյ [*Ǥ~jBNf(]&ln\'=|& RF2"+KI~ʈ,}ʈo0eӆqO?nnf{W#8 MT̀~p^KE*eT9uwe, UYǤ\*?")%"|H.u@Rۍ5([鶊90T`)ILIZgdV甈Z9\TZi8'cj2tnן|2 f0+:t99:KsLA5NQ@=IXcypEO"SGO*za˄7~y8m+]1WC](oqcyB:WqiT*-.'O׷^%qXvn+Y }n%I)SЂ: "qL1QIC(nhiZ{Y!RI:k~y}M{MJMpHôW  K,?{Wȍlg]}Hf$y,HiFr,ۓ[ldɶdnYjVYux,V%Ӊ2rI$ (F)2!5,FKӪ`-I)M3f;i3& lMҚ85zr71Eq)%IN D9.J{$5HdQև( @erb9t<-_j z`YXHyHҁ1T8J6}.ӿV_I?P ^Շe&7q59Sg=- kH/ n~QijJ듋xM%W[[UHW(N e[F:DQVJОCV b4E6XAYUJCj{{IԟG#:(>>J_18cT>:84LjZg<7酔/~pAX릎l3/Cs/JЮV;9hOH3vC2WL* H$NE:e)2RHrgB)SͲ lrY  lJʕ36Yq% 9d1##=r @4@)u4^er%er!sB 9Pf{+/岂-[ٖeڄK[O؍? { *mn-Ϟٌh@8G8hxx%(d.nlD-==d:tڙgٗ[8)rVA &FU45&$ɼ-Yklig]<E-%% ZEɬV: Mi8'R'^&$s%g4 PolN3,Ha.ൖy/]PiBpZQx1OHu(KA@9DoЌ[V JX)cvI{h8*Bk:| |?f!N&(e֐ƁfmJ9f(TYADTd {kkoEd)b`lfUUF׈uf{$Vw/\IF>USܠ]ߑ[͏y١;>ƕ=OJF^MJhlR[6ĄB'^qi)Cd4d؆ߍ$WנݻR[4ʍ5]%.̞\+ b8*du`&,F6qϤf4{ `&w槛Q]ww&vKƩW?U*6]6G}jy\W3LM U{jZVig)֡("%q]-.hoQ1W3k/QAͧ*lTj-AzHS|0}10gۯ瓋YSrfhSjAf=֜L;g~|uZuuAg`+tW`HtR_!LǍ\Wr:W$t+pEj W$q=\Fg L;;WtHZHJyWȳvgU3':\)pu[=S;\=H`D!W]=L}m1^?i 'rۻ:x<we8<Ͽ4Jzw)#go7 g7 R~j<qLs-_yGC8v^e`n~|.z}ٞ/3L}o/hr[ߏ-EC}; οqCܾQB [eB;էflgC_үɧF1;}Cfr~p4ݞ<u0^5ѩ@ًmY~N-0Y0 |Z*@>zsANRb8rrËxbrvF&G^W^!5JH(#*4JUMS6ixHyJS\Cd]M q낔iFdC5j>7'@sC5ytRpݿu&?ϓُ$|@&4w?Tn&qquhiS]ZX3%pct9|wDSg]O ^yicxXԴ_Q=h"ZRk!,+wn؎.}؝5Cf{> =)[sC+FIy\O%h0;}ōV4^2V9@0y=iSZ2{|;t{,ƌ?Nָ3b1em5guIb`,h5acˁ,a4@+3[G01۱)o:U+vKQ  Jv a? U5"{V촭=:}>UnUTr埬'TX"91X o}6>WZjOQpD+[h^4=F!A _:QYȄ^0P5Kû"(e|qT/T@Qefw2⊂$L}Ϋ,b1Lbwn5r,OHCމlp5UAFiuE.yy%[Y~jP=*ZPHs;_ܩLvM/\1y8Q>ubN$3P2|=!P'R\0tE$Ϩ[NdoXJ$vj>b ̻ jXO7}2Te_8)/ۧSkTk_꣮!o~͟&~~JdzJwQGmFY$>,rs]f"vŶ=xw z1=ϞN5oȏ.߯?x{}>WzyMO߯g{#@¢6U]oF}##{}èÀ;Ma)jyFJX%?>mzCwT%usFw$RQ57:R,>ub27DNWֱzQ'cb$N/޳8ow;|g'~9a8;z s&]y,T˻^?nj-oyϸ4nmk@Z2_X?ml]rm5W?nyWev.q};Yi/~ AOjw;h$IAjC@+)! 
{a=spywZL309^5ÜP9˶VȨPQ&)&-bbE81J``B`LRcj{a-=}AmZ#}|C>t<|x-9fͱXKF}~czK|٤-9¶-9(}~ԟn}q {6-=e>Ej4E>ԏOMt5q<$C[/zCF` -r}ϫ*¢``X?u6<'iB[ϛ8{R(M !)6\Ğ|!wX{KFU{59յӂ$JL\0xLx_XRl ˍ;xz 0h6{bK_+"8E Jp Ղ*qCl\K?].qnźsFAW> eiH/UKNc}7wϯ$IE?Y#y+AR,.EUgHAÓe?O`+\,Ci…rHE.:f6#똑H D3?}Ps 5\92"=M;.wfKwd#YN"Wu Z$X*v*)$ﱒ 볓ey}iE4hgH &hQ,fׄ]@)Re(??(\f+\ G/>,2F6ghn}﷫~:䚳FLKiw$\Zk^Z+BŽreltirs~rӑ Kܑ GsAYe,UC7 EIVsmA' )gVI'Dds;r8 2OF`Ms5E?>#Вy<,etL{g5I 9n< ߟ6Qf~'W;\ΜހL2(YYړd Fzu;Y4Kv1]ܮ7T9;9iGt*Jn :f{#| _Ng CwsytaLڽn}Gw/yьzq2eEx, G8{\yPñV 1lS?.7Mɟsu{Ûu`Z\c8"`#2Q;gwo=ȣ }8JN8d%(ܭ|*56謝]n,"j<\._W<{lc/nls+Coc#9Nq8 -IyCdac dhжh(1XJX/O6Qԅ&m󩋏@L{qK((M{!q1 1T13&˓VV&G߶xEL1{M#\"/>z]'Qok;@N L% %;Sw.>awh;;FEyrIݡ;6ؒ6ѩԺ`bxlROJZڶHey+tdBCSiBNׁIN2AQ {'hr2idՃ-': IuL1Y'r4d݆jɺҍRbTJJS)vXqt;E$Ry nH.Jhi1VRG"И1Q@*H,QZnf)pqJ F:H!J\4Zr"n=߾Oa7P?a~2 OjPo.ZKJQpA4R*D#2gJB`\{P3dD$w^o>?ZeY7 )Ve 6d [ ) WL^"Uλ^ NsI'n\|3A{÷oviWE?7{!߻_÷;x}~woq2' ,hI⻾<;hM>߯:|d n^w}3޸˫ Оѯ[7߯:{ruFCxfP,7y546&'nW#"7'Û[Kk#ְvˏ?!gG3h_xTcw]slK|-ON%98Jb)s9W(V&3{>==z\jfԍWmu`9qg0bZ)?b(ywIB0|u|rևatC\$'09ػޞ9 0M1U΅3 gk{wi2I/rJ~;+[zC]URu'w!?. < >|~6.>橣nu`ȇAHFwg? _3'n^Kc;]>(G7{; =~ǛEπ>?ם/o/sK~bo+Hћq7"D҉Ó׃aҽnmg#͟߻/arϏcP|_u3^t]ם! ٱv?pmz'rк_#f~HN-oR?ˈBΟ'|Iw;2x5t.rJ5Ӯ}2%u+Gu d+|mUނq m>f#hVJ6zn N2mTEy8i !z$ !ƈ)DTM8 6p l `%{e%y\YrgEÀA>(* #%*\5Թ1 .9On ~;_}=&.鞾ߞq~=]H  I+.jS}#,-[f*d͉uJF\[LGNۖDSȩ^$x+2@VˢtEEBJd6X@;g)VzQa56R&NfyQ$&&9gaDU@J)4:%pRu*ژIL .gIޘIޘI:|I~5>YYYY>5Sљ˚P6c>RS.ɵ䏨CnAbsIv6NF)hmh6&ʢ@QSa8A0]X![L78`8ʹ #_lP-n5FZnNqAv*N9۩ gXs*SsfŁc#,xbZ0v4Rk< 2"[>fC?k6^C*ֲ+R%LG5.U]Ew j'ASM>s'҇DT8CgSrEgH i!3qR,,,WYQ&TBǷn2̨CR0aF,4,7 aՔ`mVT((s$2&VGQnȾ\4A nvW!ʽX.2nYBe {16 ]Et6ıR?cToPcL5ԴPm-oki03T\"Xk :(>1 }SFIKDbnGY<|Ѻ ՋKx s֤ VFX`"! l0#hdGb)<`zQ%N爎)?G՘_O|;i̯j̯j̯U_ҤBj?}ؤȒI%"YѤx M?Bl$S K%8Q#b Ý*w*TsF/ Huj2fΌ1 nBTw&a.RH vLVzJQRY1 _};z>:Azx  sSM 4xVI2 OeK!CuF2ѷf Nߎ^3r&Vh [F~~}ڞ>,>O`{g;9[r2;ndJV+6MI},R},ʊnEZLuxinfcK>`$ @82ʄJJ^qJ,1r:4ԩNI5ZeE7T)dMͱ Yq%_&@HTGW'Yӏ} Z W? 
w&.Mud.c7%O)ܱ ;%V:643AZ9zΟc[mrϳxpA)I2#:v-)WM/Wتq91B;O`gZ kǴalD(Ј1[1.:n1yfetYTD`2dVs1P+@uXRU*@:OpiDw2 d(+rPdF"jWeߙ^'ev $Ɓ>6\ TREmw#(X[Ba*'&R&42RbG^S LЕ0e@kI_ײ/P[lř$3uLbe:Xi0He b(g"P ,,2Hb sb1ppM2~ ׃abq3 `^l(վ*fD!cF24 $ Ji_#訥ASՃoiBP,&!>"05/x=A\KiMeoͼ5٥-2fF,5KZ2 5Kgq9R;㩤|\1Xax~7$ iK [)XI 41"y0#`@/VB3;;8VUj% {ʴۯ.&Y\1`|o../8Z[`BQuP,8x x=㥩5> ȃ2mME:ĻDexco,bI!&Xϼ$`h(KZ *~}7Xc&> 3ɩ66k ٶxx0@Yб˽"Ʀ1a"0:zE0M4>%ݘT˽FI&D{%,~o$`/X{~3+Fr6 &y~iEウaZ+JKN^8:X3yIȑTR}|τO:ټe)$-?X>Esy*7N q-@y|< cHyXV㢔cnT`l=z*$0ppzɱ2_q~Ͽ0wX3aTrj+UQ3"G8K*]s5{EloS)%{~ٔm6e{MՕ½eR]J-Uw)(eP>qFV3-g%ޥ–f*L0>PjEHjy*eB&91HSt:rs(4hMiyJOoii?kػ%Wp8VwuUwU$> ȮMD~[~%! !jL2c'>C !N9UOtf]l>76 ׮ < cd`bZ_&Lo2p:-mm-VTVv1>S}Խ(-Q柖OϿ싵{tt{BЭ/T_~?蛃R`|ٴjk`)%g|C!&\6WMS0}ch.rY2+ˬeV2+w-V>Wv?1274#+O>{=a4qs_e4쿓$+VOvwoOypiQt))i2v_h7Pn_glm a`:LÙ;{ye`]Z LК@(~g/T'o wS.{WF9)ːv3ߪ-D&EKf2G}ĨAZoBCT9RΕ{6]Mri& b /W9ϟOţGztݟ~o-vӏ5x^_?5XW{rtKXq鞢?2ʇStqz󯿞 Uy͎O{rpONgws)$8<Ѧ0pH^`&M!SE\)O oӹ-(5 Y9@rǪ?q?}VUmEUo?X}XpjQh8'MN._'_}yrk=zA2~@y8ɧ'ܤ2%}Iyq!REf)ĂEk ,I7@N}_|aBu+[Ε:חܾ|8r~ꈞ?{ !sϯ6pSsh`=ѐZTTh2탛ChHP@cܯR ƥ'nFKnR#! d=7=_*׏"jhdë׿ѳ p3bC 9[[2bƦTq.DAɾ&!"R5=Ik9r=hDExyҾWFZ߿??|Axֹ\w;(xëYXU˫OֲR/t{^/CyNsBK>(l^sҫ/Q ~/%.8'ZgWU7rM0`О{~ +<ӥXc˭aho:'_RckbA3;lVgz5Om8 s7:k/:Z.ִ]4lJ;ىoXVEHڰ(%-'ws[骞/h~AQOuEXʐ @ 9C . 4 ,!ŶT+@RJcPslbn61V\*F01=]Js=[(o0g9Ї g(m&jP!rg> ^35.'tL+G1I_m]AIv SPfQ"@u 9ڞ/u|m|-n1 V,塱q3(5A}>A#qxL4S{v&J c IH1eƆrP-gVpYC@cGfj}>a=/cnЋL;]KfRf1f{ ʬ˵ioVJsun+/ ;Xdl1lě nBCOG~y|u ~ؤLrLXR'.1dS!" 
BM!Y^/ʳJkD cUbE`ofi{Tq8Sg4wq O~ rRC˥ceBJTFq{ULV-TBY:hA ) 5JQ<4HAdi% \ V x,cjx4e]vlIѠ' szw< N]e1TO 8Xzq xdEQ ;d<ؚ@#Jt BU%ewBRc l\7 >ixΎzӋ2 "`>) E3W[S&Pq$"@rFys#j#;]vKl9Eo'!/ Ȅ)DgJ x&Jckzs o伴mߍIL٘;>Sh3*&Wk8o#4淝=n &՞9c,In$t;fPbX\!cĕb@#lo`();>uJjiTҬge1I0e"5H>jNoT;8y>";P 28 ʡ]4$kupI5jNxb Q+^3S@Y5R,ONQؚPǾIV xvKzl7__FprdԤs(OHT&8wehb,btaU&z!uPJtVivm<:I'sR)cfl'EA=6@`Ӽz [k?y~8ϛUJf^2I*ګP]̵XC@hzkVHaX51X DW97 hƘo1nS?nRf_m '`!~J9cC7sBN/慝uIY[h DoީR9 xrDge-=yn!P/ ?~tSJ\T}1~B1j4^,4i؇)ZI#0U-c2Q qR 58·F~K>@$ (킯ܼ>CIX6[K&jJ)7g5K$ӒߔSոs-VSЧjZn@NMVAhAC Be9~adcײ:,J' c ؛F҂߶kk#Ɉ4yAjsZ?W[Sԅ2RZ;0.NLd>p6QL(ޏJM41{J#kp;J xzXd`n4J?]n [HuS,d >4 HپI}xy;W[Ӊjlp'q+Nffr*eJձ)G8.`}o% 8f; E` F1@ Pf\jC=G[nˌZ$YT*>}%)pOQZXp`(~S';ΦGۈJQ2#]Hp} 01ɶ 0Ո]vl~ح`#i|Q3y9n9RZ sp1F͔>@[T5uF}\Rw| 6]E$wQ؈ !b8X~1獝<6Bv:x'z] [V! NMӸZ#kWaDlaO@-=cߏ=vpҍB~wd['zd"he)5A4e,h0wh(a14>kdjxi)nc$4d;|@c!naѤGi v rq?\gW~X8'6BP**7CndbTO([нS8@Ÿ**R`+f=4g&NR/ R3=_( hG8H 7 Y zэXX JY8 mGF?g"v\a=|Z0jvt>kWg`w B6ZeOD@uwIr+]kF6nٻ6r$WYK|${7MMdّdf߯(VK [lF$vb=bX),͵+[<p\Lv/R;-y.،i tta(^`z/TPN~Y>ΆySp~05cԄR/ګ^o-+Cӊ<[a͌G(c5on_4ҭ@,=pmǘ` ׻cmk'FT9燫]+0cz0$ᒢLa&wcv1]-5Ը9չ+ȱTkc' ,1 x rKxī3/ kPVoc|כuC_ LHvLЩ0ׇ0ϸc˵{EiW]o`t܄^OGC_ ,N68o-;>%`Ob9669yh *!u 9K9΍FH~Yto\&Z1 17A Z[&S/;O-CWR_[a֭."$#A)AH!5Se SpKqz'5wt[ pXEEaFkӰmtDjyaѴsºvk\Dr. 1.B\X-0R)k[*5s0c)d +raA5LHcNXKCFN`J'=E7x*g⎞paД^'jwA>>B <~DXCK4cʨ J@ȊKn0*g(%\r;캨6!YN^J-.M$+ Phzn9^? 
Li[u օ";#XQ%\䮖H܆>2& .1 ꐆK+u)Y F=)&HB F(*^HC" LY IÁ~[!mP )!PWڢPRT2pHJXA 5>P*bPIk Pe|gJ pYk2 UFB嶸6F&cY M4x^j++`V~kiJJ-+ 54%H6zjH S<{+\yDNc@n'P~̜wȿMGry^Sr=Xk|_syW?Nq8{{ ~4=D^öQ'|/WonhF}~Mx &xzw"o(a^B-r?MHbzV@Tvl4+`)ψ氷JePyCipj㓟>fUЬaYbtH0n^;s?0om- gG5 >ʃq՞=.\ʷtb; >G??K|c,t.Qe #Сi#;|`Zq):":ƼT_-JQ/ax E?΋~6susj8>kl1gfc&ἛbmbQ<| (N:aǻlFv1,Lefl(>侺omВw7J#L!2oAlfw6ϭ"88=F^yz )2D)[|ŇCk>oL܂F" 9l"T*r?T~*5QZ,F}ȑ6il5#B5M=^3pY,ӥЇ KĖN.6Qc-6Xm`ٷe.[9FTB#Uc8; ZXa-8=XE؃u_f8qb {uWQjjvYU3[relx 5P/]ET5Wl0F!(ɹ',hh0sNHo,,b4Ei߇wοEuT %=z?,VmT,1*_L- ZV18ړ\o4`).Rq¤iw E)%д}vP"oߨ,<+7ƛ g~x.YHO\]k<^??Nj2,ˌ5ӈnOO\e+?:؅VF+|.Ŷ¸[}.l3.Rt۟!ZB*lPBD7 l&45jN-F"îR\s6=XU( ×ƭ J~^B@/ʳw0tHj6cv{"\ƍwf%3.s4ńWy2~X14|><ޖr Qci8.WK- 1t2 -  `5~G63 "Kc6Ae\(gpY2u#Y?ɽS,n7|u&̚k;SKHLhY !u/z|YƢD8;D\i"E.W>(,޵Ef]L$XekHp)ME!9B!4#%(cp ̩q嚒 Qzq- _Hˍ϶9mk1t}Ҥ:'OZ\<+3fS WYOkɨuh~]Ejk As&!6,w4c,. ab ;RX{l؟t<`;35dx'<~+̖w6P 3ʾ[uUH%m|ӄ6>=Yn>ܟE a}#Va#X:1UOVD(&vKm ʰԧJEMVkK4$1,#جRRpUԲ6B QImpTWbiTdfHBHJes\&Ru&MF$U$SԐ VYxw714>k{A1~J=H0[qJq:jceAQs x\p '<(o>/;8>BWGD:& N@͈(o8SEe5Sl5} ʖqP>:!FDlk]n8 Vs3"ֵLCFP4ǺKy; fo*g|&wގ> Oֻ?mҿ"èqU8+F@9l vvY"Qwg=f1F08%ϦX¼ԍ]M F5sNW LLd<> JZrT֏F r'糧MåV5OZuIdTGp)8MNu,^s7C{5nOJzJ/LIg͐A\,T;[& 7Zk!Nܳ\H 5f\a|ZR3$[BU%⚉ kc3FɴdEusX dw{ūxNO6FiAFS`ٛ9 @4f@ӏҮ9t`5H>n-8Ux8{, Y>Jq9}8'gF4p m6٣yZ^h=텝b|Uڍ^ 1BrTN.&wRt) c:G؋ 8dA0HӠҷ`c-\^ٻ޶eWa޽}AIan^iER#uEkmHf,xNwuub,x: */Տ{jHq<$}$#Ζ6uՔm8%5)ew{cɊ-m{g:RVGr߽U\`Rg:]{ݸxpvQ jY]H cBqaޭh_(yDllUp̏Tpb~߷ n^#]ӻEMɝq)sQ73%,h3HwhU)*o^UrgjnWPgWZOp)gDl+y,]ײoj2HC~;bͭ”!\2yYpB%PwP茙xCYz[6R:V%T{zNU97!OE^W+>Y .C]Hܭ!@6@W. 6FTYBѰHL)EnXV'_ï#VDr "<5!jMXP!s]Ġ) EvthȞ4e(!]Vجm -:,6Kh#PHUl 6DYQbk -& 狧_V :|6OKgI>_~!ds-%HJ pI.uZ>I@nSj,yQ HnTb)a)N -+_BP%[.\~Z:%I!c[G#% $=lfY! G82A\ *TFz2xғ ΔՔulFưY7+\ˆ҆V.rC_IRN̋I5|Hzqc=!콹F];^@Q'JcO(nV :4 B6K-zuj5l}' Er/O=ဢ㽋e;O6=`%w|?MM;=G6&O?T4%Ix%ICoWҽ9QΓxrɚan3 !i9÷%9lYًٓZ9iVV;05a7ҧ>'c9E)qD6AY Ξx2*W۝a$B!e8FS(",%sN|JwF&Z4JTI46"\FR\NJtxWn78@Da\y^Gs=⸗cHfUFTƳd2K~U$//!JP}(w s;%x=M.,VYF3o|*2YKx!0JmxL߿MywAzL_ A΀rzD3! ρF/%F\mBw* 5 ff iɅ4>?ŲRjX2xC! 
ijcDXVkh^ߴVcQMsi\C Y@n)1䞀1/؛!|SU 1!OrD=_qg~lB0g4), 59:bC B)DF!+iלkj/D{Q2HŨђS@ͥK!Vy @ !@U؎t6Tb]k)(;,$;ǔ]^kw;NsaFcH)}/[,q9507UR y5%9{.N&eƝߗۈ#̃9DPڔsG(Nw)1g~}XY05b5ٿ}ʲ͖H]Jb[h;Mulead4}b?L%q#@0T͸(]ϣi_ z:wlZL5_ZN!$|WVl5,TcP ٴr$kl0H ^r^`ߵl \,pݸ̈́0L 8ۯYgWD2u䨲J]6UqGq&^z *΋]Ӫ*ku,SE1kV^YryUr~E\I}f,&T][y{ٯ/?[7˞N\/Fɷh,q{)LLUbEGU(>)2swz9Ed8Fwɘp #bš\&|QU1B\B\z8a*߆ZU Ž:c1\Q"U?+&g܎:Ƹ9,˥0Qvrm?JtQdjju^z)o< )VO7x5(ocpV^j0o:T!3}'kRLtAF xpryYr܆\N94kO3 j2c.J"LMODb9A[h?YsT[a%t3l`Rݴbǀ,}y+,J 14Ё,0Y!/ }#L@l#v\$Í;r-I dCWqt^Qջ}imY_HK|TPCl?Jb#s1HP] (@4;2UURb=G @I@  ]PoG%~<?ө+F97W~}&GC2)詴53;I%kK zQPqa@\!y?޾~?k5N^ݞzq? N'ާ~~g^9@)ُG!Vt1XMj<KrQ}y/."*K.Džd$%* <8ӊ22{'UH؇J􁸛$.&l=HZPU5wJ,+BA12i-&!uYb~PNB0: =tX\bks-/$KӼL:wb-_K;|3_Z¡lp4+ 64t&@C(Fk8bag UC&ZÚGx~pߺ5#\cwwh9$߷j8(,+X!\W^gm@_4>S4v ,+LUa}7/*+|S/M+Y]t py:cB?FO\aT'}\ɐ ~~摯t"Ȥ1O3)72=U `m2\πBw.q{-.ź:2CV'\/3򑶜2С$xA1] iQΐָP.G`jqpE`ţ[Қ-.J .EG\~) =cpѢA/lKd\VS=a!u8s? Jj)ΦS\A GG  ݚh$Ԓ!pתۮś6=ˊF onʀen$䘐GF[aӽH 9bɝ Cq@ڸ [h !ㄪ^56 3ړ8g^Xs@k ::evmT]shjJT\`|nJlS\M%)w[:uWFfENFX6" OZ<.+y'qxm"@qM"Qlh|FO$(@G$̨͗&[謠n ej#HOʳ*)8!|/ yTZ};_h(h5s$>3sΦ?@zݑL@ž'|yO몯yWgX\@.@Oѣ0 M|5J1KKY2CiqdR7zqRN$QSiőy{Ϭ1C WFhx \3YY2(JrE|*s<~FԓƃϦqo-{i*뫫7Ðq\]Ŀ d~& 2?)57sWS3AS?PZRO~T)Wo^||dźkf7?1|rعk5h@5NO{sz;i4_D+ilpxIwѴV7pO9phNoRW~׳:TĀ 2߅>''x? !}0lPH}ḱ̝s/ZG<^ ҅+.aNޤ* }՟j?鹷n2i Su ?ߩ,fFHQeĹ?wwީϳhnt׏o_G7Q8Ғ{*=j.zoM4J[LL@/9Z2-na kQUzM{V>w'ǥS@k/S~,oJӳ 2^Fo&F8{YƱ30=m&ihKf5U(& rA#A[GI8&,zʱ~m BnE\iղ*XeNR( y5@/(Ar:i)gr:1`6FKxq^J:' 2bg|8kF N1A{$n|~ޑ/ijjjGs-PZ甾xvЈOJ?>} ?/Qi4ԎAeu`qBK'‹p'ssJϗE]5> }0dPp!*%v8br i*Beި]o~k)8蕜a ڤ_z\߭7Dp 6YٶtYHygb>u^Kor]]ݠB6S˙TI@]kl ǐK*&F‡8<^8.Xttf6 z0mvw;pftt\aO_rppЉO{ƧYMl}w.{L2]3g; ד-Q镣@QV({ǃqAˡL'T[;;gv:ON-݁Ø!Y<%x@Q#\q47FLC s|)%8? fLoD4]gv)^^O'1 [.2bb=$rH$zyY+~^"OHI=s:OANlGFQndpXeVJ7zN[`]oE>vjRFqY:TO"(QC5;q6XjӒغtXCfAPk&]0Nd֒7hN9lttQ'zұІGx’=q̉HF򑔀;eDg2-;F JP4L8HGu7gl+[Ol6n6R}߱@EVoj{csf j-&y&&i Dq ֒x z,\OF_[9ѝ޴Kr\.b ɠC$z[|JL3pILF8nxm=P-Y`L67'$.lGII'a,ve d v$A6e2GoHDT8H'@3TWBUN 7ҠwDMh4rM<!GVF֣ 1Mi`xE9^ԃ1;;! 
7@nfxENg(r^QkYךIB \(5Ti̷/wF&'|Ӗ|W4.Tj fT }&.{nAEҲIiNljmdz5'j/Ɵ9;opM/)hvh/eӚѩ{:77>m?vG̛)%o򍞡 rG+?'tRB W!]Fv"Z.Mn!;ոlxܗYfFJ=j( s,@roa!P?0zRV5^r^g}]gu>j_|ɉwͫX5{ͻs _e3)UWR)?|]_}|FS;zG˻p#=.8Wo'oP!9U !*($޼^MP.Rs@ A {c{)Vh޼v=Sf[8 X1zfG>D_ _'l FtbY pG}o[R4R֢]j-TtjJ׭+'foUC͓Wr Wr{TW R R]ܴ "c)FJfu`mtOӖ_Z- M~Ԫ^o\z=ʀV*Nh 4zbmHĦF)pQ/5Y%FU0 X?U^j *)p >T`y{ {CNߤL=P4oo7[X}vjd AxHYjF{',CwN'^rll0{`Qxzϒ:z.w;&סA?>hepgRՒB:&+S!Ta fFMjԂ7pAǜMR"ޢg4?L;ow1Im$`eC[u7KKN(G)>FP@UHBgx 0F:%.7wgk-9qfKρi +sQeKlj)rv \UnK~|}=1ZD+aOB痽d/_\_v@uI>i=(#)Y8.ОUa|Ҕ !9{:;Lp)R|{pvW|V99ߝN/wfo~c8}#pjKiJ`p>CxCi.\+k3z RT}^7s1UUAρd%rQkbXnva8f)I&ZaДVSb  9z-AY6^m2yC5зSz]\3~Zg9dYdC|MwpF#gpJ­UTV3ʅ.FKMB)$?"Vibg_|.43kKH[UH//?-Vv1g.t]fD9fKr~rZ=}YUǔ鼻o冮y etcP&Jj(3WPuyr T䳫oꀖ i|i8!qyCiE[93(]7AU#<AY@ G'ZPCրC[,4y.ztIdI 1~s@NJ"{n]!E(*n˨ލIsf6xqc{ɴ}Zf U_\իh% e2 &[d3synS;larr*"{/2hR<ӻx9qNcORrw<mb9CMO תN.(la צ]׫TUE.6y#j (V-&ƽ( x^aZqZu*dS5d%/g6ų/7@3c4sR,kF{fKm܇M9ۺzS!h[KÚsWV;N0-vĉG}[WeD!:?]/xDBQ3Z άpq5%7_*rm!M@Cʀ˪ _?mTβ=ڛ'q9htG;csV{s76B ޵6#Ř \kXd,fp`L0sw~)hK-[-֌v# ~(6H6.k6ʪhqoֲk|ۄ7}c-|X ?UE.|QH#򏍔6$awxXfq$t*~?]ܞTpiNO"L@7=|0. uf[ YT*Hc]ڤ jBd&PM(x|1d(v"t9 ?z؉s#Ɔ_H4 LoP'EEf2`u 5d>?3C!XSR.ElqV%!lj6#/?S,E m?(H(!fi`OpΑe~,9R`_'#;:KpĬ߮@@d#l B"$N8Rq{ 2jPjM|~O>¹8ZBz@Qcp!-Š\!le ,%H]qj`95pr:S&^xDŽ1C9+skxV T_l@V 6bj뢒 SQ2e4ȹw·&3 l-XQN$tvD RRz+N7qCN=Zpa(3ΌjPA0x|ㆀr̘B`W rB[ ɩX;íymOwztQ4C%O|~,tOt&̹S:=9{u2_T /mf$V+EBӀa..>_ͿtNܧ駕(&-,m%o$qyn'O x5y#?pC?u>D{z xkϷ56h*MJ_x=J˲D)FF:77u&[U6+wf,M0q~r_")DK} B֊>'Wez줕Vg+N3 -'Sd0d94)#.xt݌MA R}+@ݻ''Ws '- ೷e%z'MV'ьB Vi"Eъ&%tC7IJRp6.dt YePRAYt"2j,8`! 
q%1f: V;$X &u,UT,)q0)ʖ:`S?4zBO{Syz0D֝ijSB!Kuf(J*Ỷ>&3A_l<$d9Bև#YjGZ.>w/֓$j:HLW'_ S1DW4`Eʄ)&1{aX3eL!0Y3l؍SjXntpH;O 6ZL2%*IAlOoȻh$X$5,zS!__R,;O\[MɌHF7SZsj՛)B)4b7ӆ5۱Y!.X%&ŭxz]fs&kDRЃoJjqBh'L3Sk#8YwC-/B݌4g8Ɗzi=V@{P6i)*hmgmߚȓ\Hrikmp6Kr6*M)YE^b?Ƥ9W zL[ k%#IU`=gMY%CRLP2[`;zgwnLc nVȽoK, $<ۻˬ6trP|)Y.d'= ^f)qZȬUԶ98RJZmgz6r6GNY&)g'rKt#;Git|B)jcMìw9!r=E0Q $[jY'Jlq,+8,'M~υUvͮΜWXa8 ̗HatK]8_f u^'"(r]2'Ww˴\~ńP&Gz߽T&YkVV @vJp%2zd G)5M$[}0!&Z: :MC^nbݝVWQ8؟(*% F#ȩujvmp(mG AyێT/cиc06L ylm3?%&QTJC UBЪ*86kqamȎ yX&["ËpvS}XQ6ԷHS9wv\>]Λ:xh H}clp=Ig3^PV{12Ф=5WFpWf+1cF|7hvwwo!67GǬc0l1t{= %צW"Tܺ|''}rH7Jn#QB@%ؿ~9Ơӊv|}z ؆J<*Zk7 iZOh]gժZDd%2EDr r z,6Y)sGp ˤOC vӠ@*q&[Ba+mX=H/rl%nε''/'SBR^!)=CⰧںZ[i Uw.ѡUʵCؾϷUA 5Yh˚.N&(ˆw:OԼo?pP̺_Z11v<#/0 +BRX99OoB ִ9֝I(m2rNj2dbB.dCU14>BK]'Ϯٜ6ĎI%:9M3A T *ijҎNpJE躀I;/6yK'5'n6 oT-{LWL뭖L#2f#T՞r2vW!xsA_682:fO "Iu LSb-I֏_LYbycZLcB  *uOD&ÆwtJLjy2^EA8j~%EκNB4}[2m7o3A(&Dwv7-*rVtjʰDMZdB<ˀbk%_n `KɲY51ץ4e Sep.[:>ELQ'J}yFŌﭸ+6xG_BI-S]j*UW |Ξ&+m .uC2{ӝ_%?J 7ha [ ~LW&0/fav([^ ܶ{D 2gZ7ՐՠdO}/^e*9)t;MG B. g2 Z)ڛ]­ۯP/q᥁p.? 
t' {qUe*ܲqnٸ*޲1|껄Hi x,IP͌h)<O,ŊHmUZ [5}+x|_p{Wc!@4goz /4w_ŒHk Y -ۅu`e]+heqK`jX)AKS$ĝR^NmUZU_>x堊B9}=x7+cn4__E~d&p^+iEZFir@L^3Q_7}ŅcONg>Q+a$Gi`qz0$aH0WVBA;J+ sTg2C 6"(<Ӽ ;0y*a/WSxч@vV{EwU|b nTD+> '& ' hDDH8o{D__O}!_U뾣$ӈ!ʞܻ $R{t2n0Ln^-dKx29kB1GCh#I A kK3@caWx=5cr%EJ/tgb.GamAx*%A`6ё2I3aW#2jLj)JhP\>|߼/#3' "^b,<9tśb#^>|^-qo 6٤p+(Ƽ`XADw 48b1B,S&S w1rA#ّG>G uY LٝytWR p?QWBvgw1A~d U^R \r)tf)VN`~9Y!t i?#1J+jnJcxH\H3@,FZw<]'XTHc^cgtYjNI$$a XJ^.z[¬b^ja)q9m/6P~jgf=5jΦH'@6̾2|h<Ҷ&q~6MqChφ< {r^r|`s{gxL~ʿY9m;鱕b5S&9yM to`3Zh-UC;Z"ĚjuJ&`+ds+x5J@VuaӶn;9+7@X=ivت{P}=\&)6x [ \FwYT'kgc7B`wfR.],Gf=!lizÌBԄj=|yUNzٌ%\ p| f/ g ƃ4 *)1@0R SZPǢ_8Bof[=QTOi״y/~·1_u2Vꭩ)\7Mw.܌0}:ߎO"#70/kʔ8vkY7Gu{n9y!4 'Qs]65ѤnjѸk"QnI㏕<22Et+9Ek)@ 5a2D_Py󄾐SXHI[ H?TܝܵxIeD7~836 t $*e(8Ü'A u`o/ߚpȉ“kU↌(CcՇu(mxa3HIbvxf[?&Z &0[/|L{ Eqda!1I "u!O{.W0V wPMIB XtxϤP=<ƧƤdž/7={B3mOEcħ +T)V\B%aQ褍_F(AJqړNW SݕDcZ)BxO9f7ZdoFpBpN "Y$3$5+4Xb<›/Rm1k-_Fe0%vELd;"f{G)eAnYS?*aG2zaf\4s`I_¸6\M*D=Cg}W5H}}c==1?k?3?z򻋯>>\Foql?ܝ[Q&uܪCho[z2^#*rhqߒfZ2{ؑ)õTH%OgM`M}џs0?n|:|JsPd9~0]4։ׂmi:*R`܍l`%w+ ^F28 B)Ř"Vۛi;IdVaƪH.v~gii~,@jZL*0B$ΝFnVd֭1i~:N0u*R o7 ~Fk&uc,q )S%JUܭtm-.lT!!c8M (gJmƽq4K ui255cVX@#87o>Gn~\jPp#P0a0 S\%̈414u$-K I]c9q}a3dV)l@z3ʵ'(u݌33W"Qq7agލBlޭa+9eˏ{s勁{G+@5?]N.;krTF#g@W3 ,7cЧ`O1s`//~o:]?_}x妟߽R+X`*m4UM6rjaH&z-Iow)/U!U2K(1`P$Jƙys. BFLf`FTPJ#ϰJc,7j*JA ZN (6 F$q̲7 t*$#a F Azm2.Da*m B8i1mdFh2]b"CB[iLw`aNt G.I03Ά@#ppɹ,&͐!zNCYi% P1!^^Hc e> m2p)p5_p#j:Nݗ*GaAݠ=3YO!]e|+T|]@XY\Y|<TN1^>y4 l5 {ҩy?uc |b+JU8wEu ,9V\$Cww֮-߉\< X٘R.VMaiҬaW(e^|7_Á`ˉցdFTkAQ&R**C95Oif@\Ѧh[m$% ~pG8Ɇw,N-c(^je)ȉƧ,2ǘlw\d)%=RRqBM,D(&6N XA%AeGﮥgO JΙplv􌌺9 )\}M vHА2 +G[3vVq215_@?ZVVs- )6p4l7:}*Oqü1 x,1'ūK̷ tUic: eﺋCE;25Np=y(^4Ͽ/#$; ’Rz; boler>)a3LX}B52|"-&ɋ;JTKfPe"[x7j{Om?$2 ]-ӹ_1rvLNfJ.X6$O@ٙ[ A]^kYOo^خt;ҊUc,AF 䐡$1&G:m?Q3e<*)VQ'3w6_|O ;O i1vR߻97`n_>jO*ⅈAN;ŧ+^,'Z_k`) 3 +Ƹ`q)`_c@8/\dkL6]j4f^a! 
:+S 8q $v=V']cS T(Sf R^Ӗ0+Ut:ޡ,C E|:'c8x4!Le]Ϧ8z@[YYX#!wٚ5l|!+UĵM˛Z*Q Q@q [X~ dzXxȚo+Z%l7xnq!tE詛Rc8fcGh)0KgyZ{bW&':]&ۻc]fٲ2i])v"r!?q.w\0/Wҝ28[L׷}yxtcE\=3W^]Ř+suFYY +ӭť*jRW#15jV XF;m]OFp1%DX<;_L^wz_~͐=b(r6P{@} 2+pUt}c>(˕ڎsL8ؘqֹ!rVNI(X%d#D/ӳC#SNNx1"8$+r`UShQQJ{J5vrАTl `Q8(BK '؝/# 8r!VrM9'Y Nܧ6ɐ$*O9z1 *;Sp6})RK[vtm]2Pk+PC4h|sjYddj}.E[橂SFkPWb;f%&DxmxbtUh!% y,7x٠zyoEpIl(R3K,N\V pMU3.׃1ؑ;&z%fjenB3ncwF#}lcT#Mj\]Jt,wiη}pu츛wЀ=ҢFQ "/([10ICP:e\P;B+xbCkDN>e.)`B* eSJ%>sms^rsf8VO\x]3l.ANZgfa4r{ԭ) p};3Rրkt.5Pl9LN\$s|Nu"tHw.ov\H_+߬[2=~P 6"F)GfF8cs#X|ZOyAyZ\bkhe+F-_kOxɶpPUbW)*B)~Cyk2i?{]gJtp*']nuR? xy5 *LS5l`V::ߧMJ~?{[q0y]~Sw&O! { #DSNk,@wRovqqO=}9 ͷCwD77Cp?p;ÿptX7FC'7%QrD+'ɖFZ+/m<93lҸ|u۵f QVٵFb0b0$8i їROQ+UY+8۽;zrǜ8\w73ɽ9|uHц{YsYe勒֬mSo+m-j,JŠjmAVZτb `(^m:xu Fd8ICA{,uZd@7(e?Gku~V%1S8um+GYih%iQ>Tg2 In1%nIn)`?nW,V9)ZJ&-B8GP@Y.wCX0a[rU##W;'6ë, &<,CBiA)ǹ!ITQ@Tw;#V&Hd<氌EF sqLRxVcaHpC!r-5^`mGFcvX5uRu&A= 4" Hc8gD.Q,Xb }v ڋ |N`sDۋ;$IMEdHn#{ 8U#{A>&{,ozF^αh_|-KVdҦ}k>U\%qy@7˺, l!qzST ɘ&tWEp$n}5gm4`YSW!8H'VS/NE`'D56z%DֻjCT(bϷMx#ǃa9&J챥9%A#i=WᡘDC .l쨞x?޻Pؕ:,Ūjr)jGe0)qcD.HPaZKk)nK :r%6'J~<*-jѓd^\`u>ۮTU:/{x33؞a\dSf Y) pimpVL6L/SvWRST8mN{u^ҪWObz=[mR`YH2%/$E"Oܛ[3_D!Jr!1ݻg% a-jo tDOa^$-kmS& X1]pdP{>}(4v]+Fg~ h $HbPh%Ɩ`(Z? 
<H4&ޒV8j1[%IRcA ӯyeIBD۹eH$UBjF6јmgF/e>Ud5]J*VJpE?]H89@c*.vA81nvYJ+Ry{hcu`#}t I9Vwu4*oeT(ganJ#,QlFGDpp6ckKB5)#~&&ER ){.-gk.['Q[W_..VMS~lMMuVqw>eׯSm10rMEEd ݦzu.qU7{0^YW7=XVx:E&Q'Ycc<18Н:H0+|K)Up 3Uσ1f%1`c0 [9e8fжĽQ_èP,d;k4O L ¥f7_n t5c|N H_MO 끐8US>a BFVԟt,rB3d4;XM w1OV4@#8YVQH;/*)P } aBP]6%")xzG$FOl>H4=G3A;ϩRQDJ|\B!w'A֤ë~y5\[7P{ ^F=WTݛb,|Z(9flK*F i_J)$N19)vݖF),k!#Tk=J(5,کMAj,JbqքHLq['XKxsvHuRd WX=՛'xy~ExO''-h4~hDRi0O3n4@X1G_ÔIS+˱K +DP.4U7?% 菋ɇ+F9I@U8C h ?1'ō+ Lm{K>wkH+4EK5g!NµY5A>S 'wf>5-gQ[ʡͰ?ٝ)n-WܦbOȤr惓\ ZaFTL)n JŘF&;fDIxW6Ȑp?Lp3onj%/ y#2iK& `G'YKa+IO>R *H!UHm"L D1ª>ua &Z&+6"|0?4rR?ǣ'.ʾa<|~)Xcᤂ{95R22Z9mRKE5Z)Bo+M5LpɅC;nODQ `Ț ꏍS_'6:S)8E jR lrA0^ ZvKSMo[$k|e(ldJLn20q B?)A}"ds%Q:߸\rݾ]HIh-1YZGmR@WCE.io|JV2d&3<0k2GS,9U(3!K@lyRg ŰW!61BAXBYCrLj~O(ۢmcQ`yZ`Kz՞bǰ0+EN3c55y>R ]ţ#XACWm$ 'p8ö ^# x82l[C0჆!c+ X==z&c ,Wd8D]==XMjp@n5#JHv2xo949#]^ʖS^{eg@½2@C%u(u z+s49^ȈZeXBuTK(g9ݎͯa>I_T HqfO{T={S[]}_`|"sw\{໏.]11zvfŁL.)ev`2&5 ^P=g?FJDZ&2֑ G. uVZ[)blBT!r?VFrV `K>%nncj]94,)J9=W5L'se.L-M{I2e_SyE0|0QhH]Gɟ@L3G?-^6vELN $2aHW(uJa]DˆuJ }Gb&W> 9# ;Ek[O~%NIJKug6V@i Cѣ8xRg61RH1ٟMNc 14,[ jATlٓ6`: \y)/kh'uK8׸XpJAqR` Hj\Y>ƩP0c(b%@H r#5W+ \Gݭ3~x|n{sN[mߞbonoʘY}1yJAѲ^ЦlȶtQ11Vd3!2h2ܸ"HJ)Eilsz鰁$޵: ~+^вIIal;E k1MSj?{ȭ/HYs͋Aİs8^wtJt9KVJi,ȅSWrer'\Z0aˏ*;`5 dJY_E;F}sz#t~W99(g`6u>sLR 2((eNd.ϵ&CN˒)0f. 
ɱ,^qcE{UC[Rkc̹m*FFF)&-M 5,DTe>3e:VF[5MSUx/ X &͎Ь@V K/WҔJTQ)8J`.dMڜNEU * <_ Zʤ,)V \eJ7esH YfOmjAlg/EX)PB"áxV]xIQ[$zx@_,ӣ[dC R+V,ӣ=8D1Ib=Mev=G u⪈jn'w!x\{A!9ʠH;8 3IŠ<9LaPH ?"$ !G}A!ەUuG}0pdHBy^뻀~GZTː!|Crz,ۻJm34[ٚU0')r /[W:ɴT髹`NK_嫾atđ kcI!gZqk]Būa@kjM@e7ߦ.g%Q %ov"t[J{?6hf}lH+ǧ;fmc/ =,{vQy!T~}n%ھݢa UCv?ěDž[fBVWi;XQEv.ty&<2&V=o"1*zSՉ۩Xt;"~ ]+HeDB;K5"X1 Y)RHcڀe<4p>^UǺjgrYU txC vdwZaqhBC{sӷ<6fןD knC<3ƙ H>xS @mQ)_]J7tgӳRPU㶟CWCfMh0ژ>K*+].܊Of$'Y LJ{׆dPͅT'XhYj{ϦMGhx Gsv:RV {}eѫtuڌwyP|}MF_%ʀj&N< a r}+n"К~^ a_#\o}?ex> TOzzta&c|{XGXSQ}p*NaaorWÜh1w޻BSm,pvja $i oZFeJ.Z-<1:E'^q7)ɷݲqeظ6$ߒO8Y9b9dq[6Eq?\}J`^I ؆*k*Ph%$L&6B/P>ձO|j)0\6`,uF(u+p5|HF4[LAVEI+GM"ym/Ԟ(n .2MH904ߎ81'j%G3K}еLs,10JɔXΆ0J)>e #M`F&#0Q(*i 0i°D@`Tщ &#]>DcvA%V0Bp`WI;nRxp`&dHqi8OAI깊@%`pv6LIICРt@)hYmdZh 9BDiu03*C%f@z*3[<}Ǜ'(ESBB j.w"Q!.ѫdMPɰb  iY ЉS{mxZ/c~IxFEs}1&hg8vv&)&6~Ezf/F`!j|z-yɻ!߯>7?IBӍbػ7Kf߽zq(MW;P]ݾn#>ҽW]:R%ו@vS?_eђ, 20_.}ZZ:=|47␥|x4/[jr1T#y0A^`ԨJkxEy]Ytge{ 0iLrSHMڻqB%E9L9BrYP;PUZC[ UY]HHҚO҈ -{[K%q+Y(/=!L$7*HmKB&D$F DRJf. sC"TV>͹n (T:%`IJčP$\%>J*y,(&7*J!F`ȉ6I&rkK8yJzn%BC&Vv~0@Dd+tIp9֒ NjM<Lݔ$H.: :ŒFUT8IƈKK1O&Ou!` j(aO ՍS_bϡV7@a 26htBD;Ÿb:ypO?bz߮.:'pDw D80)` TQر JaklD},aP3w@F,s~Ň[E~%V~W78WWO.du%\lWro]wJkph8xdS{)0M( Nibo4)P Λ4OBN vvHZ\_,J'N4XR)H^IG4%*bU1gwou0:ejۑ[y^1"Z^4Voots䦡!?J2>9)܉Oja;µPsǰ cAP?zt/N _}\}R-jCE;C厌lݲ^ ګ =5܉o_aY֤A;Xp.HA0W$+Z3]74ð^%$4H)b%_/AM(+Sga)%+9Qf: p$Đ7hה:T!j ĩ iG5r(.yv4o|Op7c>GuYN"Q`"-u~^Ez{=xtq:[Wwq:GJ?|`>0}SieFu7RHY9uDUn0e`cZXPyQ45U)*TII4}5y6=O'?]~97gG !>)cs:n@G$FqK?wR{h [6 zf|Q;s^f&\9PS=FqSzQ)lNJHC]bK-Doir#p <.@=qk?\:(`bܗcҷ1(O6sl`#"RGz~1e1gq|Vt^4S- IƘХ)D5t.49za F{S^Ӈ^aŀ g#bZ!eR=eѿ}n<YZ~=}*;~pzIm3qS̟ܼ° chu*~Hy"lj`/OtTw uF)c]cKgPz@ H QR"K]R_>҄wřoAW8O;{sKSTjI QI%K"lD-@eHRgQVU,D Ȗ1/>ټhE2'T?ZEbB(e|gQ~,'%T\mR!qH2犠A3A5Qf n#(s.8)$j7YcqY^gʒafJTSaÊ*\$Rw!4oxz*B;AA!%lKMnz[?*xh'.wN1Ǭ W?48mQ%,k* ]+9gRVmV^2V"F!H-vo#1$Fl:rޮ= SR k]Vc@ҫ]\-• 6vg%b]..)ؐ2Mo֥߮rc6jdٔ9) +qbw&A`Ҹ8.;_1q &s_KsLȵcfcx6DR(e0 Qm"X^ʌ)l!Dm-{!@}7`;'45 E\KQ,T`k)$L}b=y>K<^CP@6gōɷAi m&8A+H&]Z.(V#<@>NVSyk.dV[u {ӼŃPNcz6nmuՕiWo!0ۛV=A(3[9k 
`w_ŷN1rYb~膄U3TooA$$Gar4T)-y48ȕA%D䪘SZ4" +]aU(KVG7yv?J[>pZgI| .z\֊643hiuZթsad˅Fwx1:?%^ Sլ~JZV{iu1 ]H2&b{9hU,rJYNxw9@IuI^n 2!P8) B%-OzH) |) C2#=4Tz<ֹj˖ELL"dL xs cr ?@ Wk8x H}~œ m4 KoIA(ol׎cI88N!"'l"_@…'iU@vjŸTzlmu0 KhGz>Ɨ#||_U1<żZ/9M=Vy1\44 SK g6sW"~oȼgɻ ȺQk/&{jLh n]Z7iz>Vd@ Ο#x_0CGs6 y7;tʭ Uy s GN]& " v̇ԎvΫΩ:BGx0?/1%uEE~#4E?F(7u֗yQt^m0*be}1K,y0cͯkF 5tBh|^)cq3Å+n6cEқx4Q{,*||2N6#G _GE_Gk `?~;D3ͅǿφw/j{^\|x_G[/`a08߱?yُ^ד?{_Zƫa6x7YypDε~4i!|~t£i'QbsܐfTp"=/rz 韮y;қ'Qq_kio~wެ=(MC3үuSr( d|/Ca5 gT>{[$/zy8zVS{7hWf}xK=X4pgzXA.Ds}q)%xz<=$>x`Fߦ%iBoV5h8GT/L UAoj-\x7JjFӇ(OhHc Zޥ?|-C,cc>eKN4|IkόʹN?cW!;2:.{=VbL;zGe-oG=U$M.glE<[B[$T1 f)ð_[+FJb8d]&7 c10{;u0ou4wk>%^܌ڥ6駣9 T{ ̹ǘ/L<5'.|0|FISycxA]@a[(cq˴D愻@ژ fÄǑP@b`"('(.lGnTspPA(t]hK;;B@sv́r%>%@ݩ&HOwmm';ЮSM6{Jg_Rj3%(%OcHC0̦̐"1ƇF_FHRCIoFjkp$s7uhV2iXƉ.pf\<'W9Usrsz̸<&1Y$V@z`Aق+Ω3ɿxJ% $/VRwvrr~1DL"Gc4,ekܽN6՝HMc5*XmU]9> ^W@U.z.fBO:,$!LTL&N`8V$iI`2i GUbX1 vTµ‚2ƏQΦf3woDW\5 Hgp0͊bz`{2:. ɯuԫ$*'IIRIRK%e^(`T&\⁠257'@% A:DD˯ڎWm3dFʱju!Pi*^T>zEڡ? QX[-3RC%CwY/߅>NGӁbC=@׋vGiv~Z0 9!-eB=WVJv%Q">ṿ'F(R! 646kڎTm燳B 矼\q*8ߧ_,>t[?} o^/}GpR&xM"3-@,"c8m` QE8&vȊ^ᛔ@Ae5PTn B|jʒekM$tY#g`Q`D Y`Hb$AJU{HJnįA:?\NVV3f+E/LgSnw)zp9o3!L~]m~=_-f7>FHȏOO_.(j|MNW/}qK 9e ,ؿ!,QQsJ)!ҡ+ \Vʠc:z g*WÀ> Y{^6Щۧ[zg:oY9\^;%(9UʨT2UB{Yr0g`So7-M%r|=@HR}vviTݐ(55jA*O[zDQ~h-2ʯF㎗dY%Vr\IV眸t/iXNee$1 U@gyqK0XE$z jmzjL(mv8CeK8Uhwr)Iy\ tӤu7?MYZL^\oF{"R,xT1+u8r.P3 =(B?nB$5pfi#|c;? V)'݊7dtUC%"XG~aAXk" ~2f2o2R5~PGH03&'FIE~VZgNB:~sz  ✒-S6Α:Ȇ-#pL/|St,7 xz>JE}[w?x'}El'dă@Ӄ]Ҩj8uJg 8m3(DM\Ԝa8gdD Z ɣxE ͈$ ^%V)*%j3t,)zMOV9r.UiN_\NaE,YT z4! 
gz_Em?v~8P-k[t5'[@>~)/xD<}e@cOV,TN:/6GI-&l0ͦԨmuuʵXe>$7 nzw_G%!b̲z'Һ} DZmꬓ >^tCmϗ)Vqo d<@n糛/^56pA!E{@@ˮr)΅Kr1z\L~wf~z O'}>Mɦɴf f)ޢ_([2zt(mG}nc1Oߊc]Qdգ[CgG-FÆգ'}پ`<[c3l[%f O!J$=bI:]AS-ertӬGo=8*9(Tfp|$wp*0hGsH"N܈E+.z\Rn2 :}n5p,o_$=z"aӭeH z83q30@Zw7U~tP>NrAT˳(\3a E"y"3i>"jƈ7 T¡ 2\'Y0"HJ!KƃF;ϑKUnStu!$rz:"LX=gM0\PIXK:+h"%*6}?rd-P.yGkC npɉo~^eͼٵ+,ROd>`-?}xI*˄īgT/=ePŐuSZkv1@X@PJѳtr #Ef(KJ˷Su F+FIJJFzTv]F5&OG}-fvF3πz=$0"3.am6c*ͭ DP^HI5)6Qx9fz4gfoqI/У zslicDI[XŲZ;L6߼WwdMZpppzzuSY:)j\z8$.3oТ]ͅ4H#SMdqk";TV\l^ Af 7ƣFSi|} D7X^suBE3eEWP>^{_n(g;3P(za*1'JXN~_96vOW s=ܳdt&޵rh O)jNy,ݠf~DVA Eђ9Ul4V=R<%d:t Y9¡sG3(xt0Bl`p8\|@a^RRNֺj3V>{m6b?Lnn?mYV '_ǫV./~/݋U/XŪiു_v'aU5dhEC)xE$8n)1 j!| j5"we=H;X̃n,ׁΪrfvu,U*SʤRL1ƮBGq"9gȔ DQ"HeV;WҘXg3ی˄+K)wlC?xR# f`}/MuTd_D8_wU??p;՟4gbt{D#rrt89Հlz( Y)m=WO`.-:b]0<1eNܞ+ ͌T2^LR?EU]ثJ BVQiXP<9 ϫkn?jWR}^ɭE^I0!jTk# B*J^,Z鯢7JVs0c9)PN3jkӹ>}7iv F<0wZfo6=T kvp[P'`Kd)ҟ{TenU2-lW] Jyn_20K%GwEi7c*gYR!?պVsҲQGnyǔ!\+d}ނOU;5ru7>~Xխ1 4{=f{zv>'=~5UA}~O`xM*VONf*:0nE'vG'A7Fд(i.GwQ3ҧ4P$f]^mQ'uC~m6X`  A-VI?>H! I i[U}qR,$aܦIQ5t rF<\>} 2]Sy6Wź9Hcc> ҒeݹisE&)`W3e9FjUZMwȠ$<2Q $!2}}rr^gL!40F#[Tmtj{dM7W_5-C50ܷMLj Fw .ñWBiWzKnDwԵ HO01E i~cV}bҵ[bj&8 %ge5>|PŏZ>^lR*eE[SKvU=eA|1K]ZP7I?燻={Ο#]k|@^|;q\Yy5ШBfKF<rN9d993E[Ujʏm-t$3F4XY4LZZw𧪈}Lُ`H*:w֩~d pdlg u(~':CgN )3BZͼm:Cma$ncwIG]h_MWA[i\"dw>^^A4UTq_7xCdÅpjr4j690▋rtŀUѩYeXi9SE0N^*A/hd%b`;c]#wvR1OՎCԛM}rN#[ExގkvH99GΗu[|=X1|p?ă|0^ T_s^sD4ʵb}su,}y]pO/~_f, blي8J8¯|! s܃$EQwr+*xXq :ft ^Q ,~ՏwCq@ l 2]l~z͸퀮/wjs3:1GjgK TB*3 ĚߡVH99V A 8ae K҄X.J$9ky.8ut(2w#|gʏG p-&,f-E} y~g^&|!cgmto#YiIMՋ\SX$qR BEZ핿s1󞽻 p:ɋ4#fP𩚦^t4s"x˦V&-FL9x8(Yj`a;޳ft@h6TOrq滛kΗ}YM]{=[ꈳ^F8+ԩ^ Ȉ8Y^Kn)Hp/_K|i'.f;P;FjK%[]Y+d&Ӑ>WXm"#&vm#gque4G@uF~ޗ}DRph?=ڽOJ׿ϳҀ {[k>"*2AЍ\pY8+|] :M%T?X,N.  
֜prʥt4Jdr8nZYjT]*Hsb[Mߍ2 rI1|5}se(pKj7V$sX_G.8J=;>˙#x@uwv]:ޣtb7'?NHhK&Wg."YE7u s.4@DFvO]K qV$mh<')o[tu_6EijK$v = Z-'j8K50I1v2`n(^=7DV*O@/ %lrggBdrρE>W"wD@R`"O8D HcD$ Ku/4.o'Fa;dFA3f9#[ *f04z6T#4dO~^b9@4E¾0>C…GOO2Gm21|hyc@؍{naOGm?|vσb֫4Co6>tcf(%yd^eɛ gkE{H} P}xTAtT'{6Noؾekz#%R{e?f﬇kTmvb %^"J[ j-ђKɡvڍ:Ԓ8|\ߤsU5nvZؠrN`t,Un9w5mFFB'SΑy XX ~3ԹSaf'ox/|*7l\&/0="I"W Ah('=+ir' 8;hbBfYDYX XRf9Ky }¹nm GFv^,4# JȘ:hac͑_,6th=Մ8b˖oAԋw̖zv@DK=jiFKx8i>? eZBV{Y +BXF_Mźx66xAr>rއPKOw5BvQ&˾N⦸ݼbH9:l>n@C1@Ա')CjϬ]~\h |:- bѼEgn!"Wv|hM/?^lr72S {f^QvcO>NU= yE!6a sF+S_c\?}\q5(ۼBLe>+ Yqu}wq" n N CBZwccHE"l =`d V(^d&IUH$D\` INhˌkCEp,`pŊfғ-=hH΋QLu;WOVn‹w̒P2(?m]W8wZ wqn޸w>=9~Pt $0yt6BŬL09rΚ[۞>md&3< tCSVnw.oŇ!psޙ}Onb^fs׽=w|6 Npy{-M '4{ l YnχpL"ĝiⓛ~Vo^"C?nI۫1- Y|f|iH8jߩk-M0xPāE#u] =~Q#v={L')E+*Xe4XԨyS;QoxA.H9E,Z$x*jڑBOffʩZmEݚ ]ݳX!<-ck{&UV>CIOG9RG_s?}sYk,mYyf'ﺃյJɶUDUG~_(y}G,}O$%9CK43*qyՇ? U9qfקK/J]&E 08dйT$2DBԤZ-_UMi>).HQ8YXi2U1RLZ r MY"tJ NE.yRxJe9!QelR@$ǚmݻzYݽ5Lx(5w$KӅVhwխy|is/ F7)0U=jm:yFf Cm<.#L&,X 6g?n%2nƗX.ͨî3&oiX36'#B-~2AiH*&z͟ҏdѧ 'фs{pgih"t+%mc#$tTc b=bA$2qE!bt Qz$h:2*Heb&p vۿ(]7-6tZ|`*@0i ƚO,@FuǗEnɖ5bٳێzInI [It!)&B7|pe_rja.HZ9$6񸍄KKrD> &"1̩C.v41X©:/˕1 xf@ǁ9ZLARr}C<6Tym .…^w%GlZ(Bd^CNUC@ .@+Z]pcI<֝cI<֝*u@`eZHC"FC0;p^Go(RlPZ.nOb${8fHۄhd˧uLHbQµs1b7YVboP !J[0V5&#Ǘ(q(H}dym#\V Jbd#c?0ѥAjCm1^.+S2WdL\YMrBRcX"J2@Z!MDWVK\S' z,Or!r&QHH?9υvJGpO1[.}dU7B<%<4.$4x^#̡tݤvN{·Y+R̰!m1HH+k!W?p܂4Ϳgc*+.ԔX#xLw47)]I>3*gsoPܶR֪kAn$JDۡ䓋~=o'3׃Wq9W NWNό?ppl鍨 b#ꡜBG¼HnrJrNwu$\5Bd(Öh0IQJ D}"ἅ%):h)dqRz6Zke\݊e4eZ"Y.&ڒAFS9!fLo98B`w:@sѐHXEDgX1V2p53P}MTP3XQEX ֮.(=6uhXЌl,"tVATR(oؕ@ 9+2PYeVDV aȌY .3`ٯ3Tۃ"r9\Y(G3yQG0’TXqXIB*`o]3iL+ ƥT:y^I/ ؙ1X@\p(DIc TAJ FJ5^Vɦ%J(!TGr`#j \TH=`׆0pCAa [DFl0'lc M;#(b9ZA\1Q)0Xh#@ 3e84vQ둂a a8:4"EC"o0ZiJqC%Q`[-8b5]guI! J 7'Z Wd]ko[ROge4߁լ $Rih]1f^IXaEFKܣ|(SV^c܎ܥe*i{64B. 
61050/vp9'n؆h & w L52 =~x ]t*kNX>mKS0da|d˩t qȜANAN\l:߾0Ѩ{O}-+@{ Otϯa~<~Il)6kC#וj-zgn!b"az scuxchv<1V-Վu8=auWè=AUT Oa9 gKdg /o{'+q> 5m0c?>py@eΕx%R2>9\ƃ 9l>,f]PJ0\X4z}t&{)N='K~z i?J:.e` LM!Xڿ3|?_Hpx3i_pu6SGf9khr]|\t{VzWLsnz7pgx_YU>6W Ix4K*.UklU N{1`(o\?+֔?Y&oE m0c7^WEc"tZ:ևi~x ~y'3^/fSa;}}Fk^K;G2×ܓJY"/>")Q!oV.Kt`ťk+AK={Eh ΐQF{Vh/J srG3B}fܺ3p}?zƪw,kN^;q|Վ;KUp<úĝA*7r`R`FabYZdC#4S'g.:GdAFHvm #Z$׫_OG2%lcvRnm)ߌv>,擥=BtXbJta?}l|XX}P' e^yuRIWyU))I+c3IEFL(A(4Fત#.nC6d<ӄycBٽ {)gN_CBtgn1vӓlӗ4y3jAQBkn=:LsÉI= /'`bvc[o˿urk'=5}]Vfs2kUxW"Z"740AF!F |T!ÄJA) tˑ"r9e,͚[(S*P$m4U,Il"IܰO>`$dɵY{Jav!fFi:*+AS?\<;r(+ki4FІyJ ;dJt&zo~Zvc(:Ė&7>z<jY";q 9N^ôJ`B*#ݮXԜ4]BZ q}2TZ"W=\I˅xpՐ?NAkAydj` R{Oއ^ _jRG>)k^͊+춇?bKK)qc.K̸."4FE?aS# zse_γ7.kۂ*8O2GҳGjջ1|hiƱhEc,U*,ݬiH#>`2,y,* 9c+_:SZ'{IǨ0{[QjV4(X)U#DbITicY=BE?9lOsHRTĴ:W9hVhOo&PApL邶?X4WU P_ h: ( p^~#-\Zz49[(OR,VCxhg3K*HǢ &<2RcR'Hq+ŷ&1Wͻ췐qIF_UMa^𭒳Y Wid1RJNk(~CxeivRaZrj@ Lo#*lI ((Жgb5Wjjk=;5a0Gj"H$R{mJ9"+Z;@f^WТv\a}KW #bRU CLpTr-pɈ Fyo9B{L2EnZal"%\^j.#NB3'뉉&&/>J;GchGGQ\"P ́\[ȭ?J#Лl%U6`Q{6CDRbJ/ h㣣A|mG![J`::%g0IБ#Q3##^ d"M-#Iu3 ;N96$w[)(52lMJ}{Z;l[wGSiXxcImǓ=k 5GXteui0Y*/To5<Z&ZFTn9a5ʈqtQk7X6$qss3}Jo| & Gyb낷l5d~\IYKÿƅa`r?{]Pj;u]›Sxp >^f-'DǷrH!_P .&.Ή+stɯ؆k7=&CqVNao7> cbnCnv_kx>U#agSH?p:R؁ Fr: h\ IW33AI^.&aR=Q4Hֿ4G- vp `~&&w;;\o'?ft/F+]b#.ʟy. 6|?Aaw1<݇-| wmN_C߶cڸkIN쉝v:M&T#K$_L )u@!*b;I$,}o_ysr :8 uCFܐR 2G' XHz^(DU7,dLxnvug.С+ U0!q>w䷋/go_?9O|&9?Y.߿=]w'^ɟo|诓7 K'Ϟ^^⯓_nޜ~ؿ_<߁;}ߩ1~Ѻk '],-)~ڶ[wڎGף {hVߐ=9^/O~>?~dz϶{'!ٷ\qTjg|?h=c=av)P-a"STLf'QNC.EyDItuL.]}vDoznSb,? 'MM=d_u-mD5v! O=SvSOTy˳Ϫ6Wփn}/Est wN&%"iH_]5xVzl'vqkc}'O^=U6"(qnN_}Qvw=u_J{ Ąϗj4N|RM7jQW?ŧ 3]l㣋'*]<:h(m"F%FyGS}QEIOk:t_cϞkc}xIJ׳;)Kty8;ʈj;ĮCc1}ӣ*G%> RR-1Էi'x>c>5#;Y"RUPע)i|OXݟꞴ< ܘ }= {ϚYzT~Mpuȩ> 1J=C'߯#߄ݶw ~$&pƽƏJE(yۈ&-j^eOu me+g"bK",8h'x@zjh/"7R |"7 ̑@LcUtQq JL*5p$R?q.qeEhJb"$ S|8 М 7cUGoM̓V 2ܱ|Uޭh-)|l !;;RFYYΘ` VYHk3YOG!ѻxyQLE]d"(E2KLodxLWXxZTzF[(#8. 
{jS!>=/hjS9G W88h4qveWXN3Z6)=+B E{{fLpB{L,pMk ʡy2a#b)#Sc Q#}Fel[bUFfi(uJU?東IMO,8nJ1u;*%\~DvOp6nә}mUՉ/-b?Ef&x]uԇ $C Kd'N״4)LR݈d:Gp@vZ}͋uVev+ g鳤Enөͻ7A$A!8OQcT4ypPοck%oDIzlBޓ_^zCXq.3/6Tƈw,狱DR=j2C6zqٽ(H /gyΑl1EWatp@SgVLp,1?",8q1ln.͕l PT C \ɘ'^e'$l_@Wb <8B:=;Œs.v0 g1iΕnBWY9&GULۂ@9%n q;Bڹfƕ `h3>"ak"^hJE2m$ӞH{Nc&a@W-V:7g~DYݶzdOIh;[Z*qmK,3E3E`RT [Պ^_,2RL9<./M, af`i 8,0/Z̪\! .+,w؏n DA|*1[#Qu,)x4pQG.:*!Ƣջ~o=3FY/1yZke]E[,Է>G%*sl`ϛ '$0"D: 6'eE rǾs3AdK= SSe]䳖RBW'Yd܄zV ~s ,a(xpm.#վ6Rt\aAI0X:"r'gZttϢ' `+P@g~CdtFH8` fa5$ |˵khNV`Y3a2CM S; Œ¼ `}P2ͦk{1G\ʬ/.(.kvV!dAh^K@lg.QOT?%,Ix3,Eq)Vܒ,.eg]Rs^E{%fdtZ /QFs>jRV%iyW ɻfލ!:Evg;#LK?xlCK }Obvgv$i =EKm0(`+oKb:i۷Al~/>z'76ɟz0| !QFMb:.Qf,CB;XE U95g+b©`sN!,  K`scasBJQ4(cj[FV"~4tiI,lո`8g:.,LͭBLf#Rϒu=蝉HTG/>8h]/ʝVF={tM=@Y0c a(驜mOJZpʁC[ρCj`_O<4- @OSǖp&j)UA uD3!e-έ%hhH)LjLK}ӝBq4l#n.!D8j,xNaQ)SHN8 @E t4<_wIڼ9 Vfd54=cĪl9TQEi6+؊)zכ=5t /}n<@C[׶*+yE+5u{u?L#CH}jGOCHl^;Û֝ 7G5lunz~\qhov{p/Ψd.'ґ09WrCK~{e` o# DԷG<˦Y!i{}N^sUvcSbnc[j3$H9DWDuJczJ 3 2#Ijf{$BlD#F4mW޹Ǎ@n8cI0L&16 j14U֮{z>&GLGv٘}x ;A@ev*;4&Ede~za?U^.w"S*@1Rx Kut=2荹^;nxYjJ\ƸU͐Lc9v}`X%֥K/Euj !3$P]N%ys rɫKa띡8N0阙|fQ0Q%x% PeƻʼrAf7f]J5޻.joca:^~yf2mCdnfqʀC4 74i}MZ[_ɁXR=DJhs^}b02zVq36q ;R wCܒ5ӒKq|L?vaX+]-Wʷ=x!:8-Rbq` |]p \Bc~L WZ,Lތ+ԑhgwf?wZi\]ţ Ul_fp)+G1;nҥS&zJ%>dh0Cey+>e&}ҖMf}9Mr0soLn -J|pQToK=uiS.PsCq`ҞEImϯM*A6k@4Ц٪WI܍V]TzэYuI3 V]i+nw7gGvDZ8"}b'iw}jzRB4>[Z[z:\/C]RF&Nۯd."}֋JqCvƖ7rMҝw52<"Y*w67-!PAs;jvHsV^=UIv-g蔻S!tJAiJw"\STⲝcRE-1mTd;wMxRYy=)OJv`[bץݹ ]jMMEGFwy{PUBCpNcA{Sd{=1I5hloէOw Q8;*f}ȍ#;@mooPv2 Q0ntZR;8;J(8:ǦF1a;䶟Fx&E :b"0QLjX7BZØJJLwNًpc}K\vvv:R`ӂn[ㄠre rB#%Yg4!Xȩ`qmJ1c ZWhRk"S9X8=bE3IU BszQ:*\3zBG ,U>|[j('0"F#ca D%@L%̙b\o4\Ŕj¯R-M fF"T b>%&pg 26Dc2j-oDM6~Z.R-1񲘯u{b5h+H)>$QPY-G{M%UKSXJING> B6b JA%Fl那WWldtS$+E(L&'إIC "i @8uݍ_aEw{՘kٟ`D:vr[~O9Aޏu7͕ k'WtP/4J@$6j\!Z\7E$h ?tio%,j`:򡂹fSGy[%>5OM.gu|5W0~ 7 ^i맟6I3y=z5 ׏:߾ )w36_|7o_?{u/`} t?8o|{˧={oV~:fڽ:? 
1Kr8憾brWUܽ1Lw~ZvOkYEmX;rknC4?wܳ 8W{wrH:J( >ZH/I>a[eRg}[\:@ݠmJ)5VzV!J9̔QvcVԗM;[I[im܌iJ@зϷ os?qY/jU+,o74gL*u/P;u){ k8}z &A\T iې@\ؐ<#v2h<:, Vs PB PAf4ZB,,{I!CEkN(bAYTeM3II% #W\ 8O2uoOl4Q쨆?iP,#I],( {F #H},axR_  DzTi+ێF"焮]082-틍 kQrNzAZJۊ~(lYeb-Z`nd {N(i |EJTiSrjlV*e|24V<1 lYu 5,}s2Mx|rJ Bv2jhSsWAGId",s^XBE*ym3L (2UD>B!e ZTLHq,vTӿfQh1XssWB1S `R!rس ,`-0Nf$e+vJQ12 cٞJz@9J˳JV30AH4虶67G6ź$VLnyJ*+-$%)kۨ|(r3|+y&,<p:$WJXɲM VEFH5c uK4 w~O=X%x lLs4> ~W/^FnK chlkGG?ppPIJ7hwguLgkl=km,nǎQJ˜z!51).j\Z7_@PB kz7n(FiRjde#P*Y!q pUJT;6=aM J(+.^v+pHRZ55g@yh:IS!Z{* M uͭ&FpAjy`+7GlJ-D-uANrqiފ?$;4V7zVW(kz3?PM92/WT[Y͏u|fr_R};V]~Y}h|*aZߜ 5+_-%tvw:m64ksΎ,䅛h+:qzۻ tھ#Ż/"Ldy"[MM qny7|[KDƖHݎ nX 7цME^:j߭vW0+|\/n>T{N±QbQԍnj^uQڍ~wc*b3-d]do2z ˦N{5ES"g = %sB U+Kn03B(&ϵǠdޕ5#鿢KL/UCoY~qGHeFTQ.$HH筟H75YZxZ9K+q,L@ 25]\G)ڿh`E`:>v]cW.Be]9q.r{sQ1źӘofI󍌍u>K9)=P7m6 @8q{O-f9C>U_y"S즽 [v0}z]9=}Qa!D!#Zj4ZrZZ )[zǺqM8X"wƱTHnBI9ݗQ=|A`2%"kۅۿF!7[,J&2(L{&zɟ!n8?L3 %$2 H )0uJ%, WJ[36R::e;`%!Er3SYT0%;Pـ 2TArϥGzvd*cN 2. LkMGѳ̧cOV(f@ӈ z&k=5)&I1fI\: nv}$:v0wwN`u)px|)!M ;9Wn^]t.|%x20KQ3T1nT_\9T:8a#lF7yB)ㄡS*ozg5?rMԬb*a,U[Xb'Ňu6N:O} k!ͪΔSxpMͿnS-y;#Mr@qMv AT:YC"D_:JʺxT ydI R[4۝[jDKJ@3 QuATf@. *ͦmLIvV*4S%tNm1mAR褨Y)hlFuv ҪOӝN#TXg7n:1kcF8<1Z6%h+ھq_VZ~?>?ߧ!bBMevq@^&b2i2Gݺ*7,s9oנL뤤e㑡w̠x##xjwg'ߝ|wɷl!A!JB󁀈F paaD9 Cw  q |ݎ6 ž]TR/j+MlG׽e5H2?,kT-|]QXT۔@kHHGF$ltA' BMd0aJp (#EBd|g,;,(O.׳;!xɸ.1 >I. ^Jsؼ06 At ߮~#;bNmDI7f ?:&3 @nu~Oӯ7}b ب]X]$ G۽ɛv7 ~oŻeI.Ԝ8bIt_L[v`3`bW{t芽%H"ӋuC: L+2cd(@09ȰLR!FHGF1"TT365yZ 4]cBn^u4O|D#5>'*bXCe8`~M曹X_fB}"{xܚFQN@luEegkt2ws]v[jK"`L!s$7QglPqaclRtzڒ,r$v.uGGb)NI0&%*S{ut3tYC73U yk)D0 %!Ba^=󓚱WW fO0̴)h[4۬|Z@37AOG1cc\dt液t' I+PMxzJ{ 'ųW.h}eqy;c}q{eɮBluԊiͬEHt NlG{YD״:UVV<,L-K8=ΦCs#v(S-ƼrsQMFU5>)P% +w`{ aVbrnكwIiVU0ĥ@+)C@hUd!8Z(#Eh$$A% XSZ|O۔>k35o.X8kt[s @Br9 "_ɔAR!@DI"W! 
ҋq[6MwIѵPǁLaZ^oԟKs:o?l*r׶p+Y ]ZfS'oa{Lv`켒[wV^/2PN6I+R2Ew/:%gney":e(θ"k[vBB^),[|hr[YN>hs"1Vzڭ y"zL )mj 6<[ nIJL5ඉ* ;Ԭӂ剽Ke(ucVnZYE8M/k`Ϛ :B͌\_=-m~ʎI}Ċ S1 P*ZyajJ|2/'_C"DJmzL6NJ/QJuRǭRDݤ4IeK)nR_vR,ܤ+~*I%J)EnRJQ7m.=D0O 􂥔\Jեj)TTSC'-I)~a )MJS%teK)nRśH )=D0O5%@tRzRJo& nſl)RKCR/MtgKRz 笔b܆TTS :{eK)nR QA3$eb.RImy,\{b.atMnfDIF~F~xt0I)m/a!uz.Xˀ2AQ r^B3V )}UJd6Ӓꄕx2q0YLGs3'+ >eI [4؝B 5 gٔ٭MhalvNv+fprppv*V7hUϳΐvB.q}bt63q.CVOU=#n6.mBW}*Rk_nJ2 $JauH Ej!ZVN0!VJE ,NZ_G[&s23}ot<^,a8Ṕu[ݛѬavL@~oŻeȃ1._Gv'Ę _L[v`3H홾#YOW΍ػQFNr[cЊR%Tl\W*KbGʼn-he;<ՔeK)nRJRQ+<4[,˓Rv D"iz3IKtR9@Lɩx4hN4zpJc`\bn0hh ]Ofg/ͯBģ@JshW"2kA" a;~!xD ܄ V9ke$UDIy#!PZ)h"QpE!"H9U#~Ok (*q(ٶQ/ͿPBz<|~ C?R) "$0B$gz_ieOvW/pqfA&p60MX-L2O[d%![-vbu!r"1> \J]-x~5!?'?|!S%~.fSؔ&}ۢᲙgd@2YFc"fajk,aRX[Xk`[i-Hs@,1g>1NL 2W,akhndi֧KP rPxN 'ejJ5*u3,c9 sYc~WFKYYmdDϲdG,HYheTLbHR 0@aI(ϒ2(J%Q.9gJ K*$Q>GA1 ػS")u_ `%v]bv9XLn|nƵ4]M?fYtM0`u$ )rh)~L0wIy6zI:#-XOI75*@^̏|F}vF cka#oOoF, %]d;w+<f5I>xw9%B+Bt xM o#"uM7o͜\Pas.! ,9-k5ܲijry^QLN۾!ԞNM뭑ݏ^!"JC N {ZBA0o -ޗPڃ#AFp+s'gol6$.eK<;װ%N&.z]|O1ߓ>ʜrN%WTsV*0D|@;WN *$ Me7ot|?k0f/n-UvpOkcf0۷cCw,){7+&lvA}GvfѸpj#D-r!D_ЏRNJuVQsK c(u,suWgp(,^9㠇C d4RgP^9~/~sHK<^``wrd]mY^wfb>.{#2ͭe]Sfc,eJgs}cM~'wEO('B7΃zV}yX-IUeO9[8:>[6[a L9| [Xd4+KCZ,뤉)/x}\mSIf'RRs=>njԠVk^ߺle3S_ΎOxp? >8x3X'c VkY#m뚎iЮ5ZKx8g5IÚnq3q2Y"ԊaJ˦@u,p@b=)>Ii6`3l7s)-dOU vDžY6?gMB8dOքw'3ގr`͇2 R7T(+b467PV%& :f}2+$P|5&Jt>E1l~|~Bmk{u}~owl8eqÚ4nBbB=7DܰSˠgcڡ5]Lo3nvbs[Î&Aoh(R6+XggsBtӝkXΟ티3yt:~m%V8OyMhf3D,ŔW}^v'kQbv,Z>W{'| OA<񎾱Ew3u#02[{Veߓ-UTaU=^,-fr+̑(*Z'$Ȭ"BaI ,cXY;4ةa,eJ*/)C-I} V[aޖ'}V dA1 iʿ6,{΢]38?4-"}0wQp%mv4(! 
f>R]–L ׯb a[' Glՠ#E~Vj;P$jE#%eAܨVn-G0koK>&O 'T \F{W[z!yp>ݴY9_kS£ }Y5y[UV _{n]UgƫcΏ{*䌴dN/\52H˄-9:L_jXwYQɘ =L"1ǝAPOa5|n ߓaX _PéI 3q摝1d5Ak d%\euCUzQ&w EkUR]3~CL>􋭆 k`"oMw+Ls8.|t]jM/pJkbv._D+ݚ*M aCCZGWõ] oeCX9]IJhZ 6@؆8UB׊hKAyw1E(:ə|<SJgNj:o nvV%Rii5Rt4I׾PNi$gP_GU1섧OM׎`jV(JIb"|sH]º_+޶CAڬ˃nkPcMDcJI~z™t!bѼQF4sq>~7B`zS[I]ؙx2B4za ;z=ɇ$Bu'24^)zF8I$=z$1`%%||rnn.wXǷp^?LNG}2_Z'&ߒk/Hf)ZuOq-PZsWn|٨ϓû /z(] `4FtzFewW"qd޽|oA~iߦҀ1} 'oA3'Ze~1u$w:DQ;eQQ3Ĩ*kRʸ@7eqV7²MH5Af?-;31]!a,7f|\y] ̳qx@I/Tf= Z6w9H|'Cˮqwk/Q~׸.ϸf:w1uIK*,O3JQyjHN-,C8K-N17|_?/R8''u_t/f C,hM\6ЉL7dlk`ՄjĚNR;4RF$g1㐵>Fhj#M-9MA B -T,S Ը#6KSs]\82F8 EuykxAK!s2@KpJ,$Cq1#$D d.39jrW 4EgAVS=M&FSi4ZBq$iEQAG"7Ng1 Jui !s+&4+=Z Iw"V4~*h,r@q7ijᒃ .:IPi!M, q#s"ZsuR>qd-<@20KMMJz`HQTeJqR"r#ga Rp2,0u`.-Qp׷f&XY?37&wWO'v<گKt!հ͉ϯ/Tq>+zꊏ~߁`.-!"çoK>lmGq%@o'w#\1z8Ja1Hnlx2ݼ8f2@L*u _b2Ķy%gX)!|[,5M7Mdlsn62*ؚNiRYAGWaqo> &ae8AR`aQ"0mF K Lob6q6<ͯH7֨.&sP׏gʞf|bW7b.:ps/fq vEi;lq5TDctrܼϼq27}C}LD{qΐKB*|,4S __ܫ8,X?ٿ{;}O$f$8vMDJ1BϜ\]~5u]S0:B搳Fg'ϛ3s4OO JኡpJێXImKƥ)&h8n+DrP27ccɌ0 TZ֓$$Qm4EK-d.c1ݹ+^ZȾq[ M"A]W;'yUl{u_?֎Yś~}כqMA;r]^둙fl9N^˞vm޸WsK!?x S³7+jPbuBqvktr^v ne\[MAbuBqvkn8n)?=[=A}ymV51:8J5#Vv mv+]> o<R 7 E)ݾ~T)bOw&/775x%`+)?*S>Y+ Qriȅ(}^،/|U?j:ޛᬱ&lS"l&!B9;)5EHO4AI BƉL%E#v<&E~0%;Yѓl|3)IE1[g\h;Wi#$J&Y,0t^XeEb)iRvЄ'aFaKRHJ={N\ZՎN/e"gYDq/Z#ZLPOYH *bA~!tg'3駏8 -䁫40 H#cH*Yt,# ErU4ϐ]Ȏ,;r-f%DrT!B/e^+/,s4ZFK;E1.=*co_ ?Qٞw\8M`i =œGK`|E2f eQT[^(P(j@ P#z) \2x6eOH/淰^K/v gp*ɭ_2 Y2U TZɜR'{1\OLVe$=j[/a$sԠ.l]'qv0OɜLȲs~zxw>vlUMpڄ'q%҃0pzNWnʷM;yC3 /(vj8:Ƣd[g!vk\%bruYVnJDek9gU-E! Osz-S?Br *@4Bp0-k=j@?.ΡȊ$uy)5EvqVlzu| M1i Q1f1!"qʐ"DW_X 2~4@/1@KeD$r+/i)/Fqg4eV/.صޱY;HO]-}zKP_SrփTr՚_JF9/d Z2v`Oe~>S.brPܨ 1(%qBĆR TcdW::@BKDIOd Bdޭ8LcI(lHy%P$$,TK% ֜`cp6#ǂ+IdЭRjazgr2Bc=o{p]ˑ6 J~SmA y"ŐZ )L滐E PjIb-0n4X[ 5[ OD O$}*MNL4KXoSیV~4j\3ts}Ogƕַ`[ɳ p'3ٵ 3k9狫+x 3j0sIu_ :dY"S%Ck/֤K6: (~ٮV <۬yOu5o]>PA5+sO$ rL$aBD1q 8N1e[I[w>y!;3fy9Lj6𜧡 -2k0±5fh!Y#)ʁ$Ib(ZsInicCs 5q ()Bb `q'N_$ 3(6*N  (!Q.rHK_.90y<(@3W y;ꎞfhav+ery-y["C/'&Cߪ_w@,(*t>~xٵ6d:[0v>؉kQ_{j:¶g|8;w . 
ۏwwI҃.af x?t 䎇(!Co-kxH[oQֆ^Z(t&|+ǀQSR!t䶧~>>^3JK{UJeIPB4HfZ[G zB+B)アK^RAۑikxpZu23Nq˾z"qL4\ZМ82ÇAM-1Udn{;g*Pt'.* 弼Mt;7<TK"?bA8F4 Z>iOp'F@ ,Y9G6%jdY9ߒ?j: NkODX 33Am4j/j颴ˆx9tpq 䱦Tx%R&isch[ uHWYL2VϼDq;?ԋ=4jnyl/pU) ~$5v~qei zj Jq@QUsC]Ѫ[>a.hyZe'kM2xl^pOn$.D8U;fZN|9 c0JR \E3pvU|l)\MҩdĚ]iuyF{\:5VOBߺȲo8YD&sI1&\)A8-)MsJ CQ*%֌BB$O4\+P8>%Xx+ - jyOM LtvN;p.Ѣꆴ]F㱐_׈?"Q` mY@ce;8МUwAPNhd!hF "@T{pwRYD4RHbaq FE$ Ig?Y+>oٴ S 9Es!`$~[^ y$52HH7kL/6ӆ VhR6q5[ Ţ+EU$ZW]BRVMVӅqC8\s y6 ۿg=oh#>}1,9<% M].?J*2aJ2!X0cyc#M͖cl%()ybR#P)RHpL,X&A#$RƒqlF 3`K))8M̉X"ĉ <&|g7qAq Cԣray/pPb?׉g"=22Tj'Ԃ(aJ HS1DbA$ FXv!ٽzF`5;#P(ϤGKLgVRBnܓgB8OKAE})R0Aئ.*CA٣u`IߡUKs.ȶNU&!BwC|MO=;1,D ,-[r=̳RIv[ŲS߶-G4b۝.NHNn!w@Yu3!Drc `M5{܌ wJ7}?6INY i]Sp:oˆw{Z?U_R29,u"pd`$^Mߠ^wiz½6O"(F{ ᵀw|:eU2&:oյb<9Tm/WN럣-TάL`ڨ1W2p o> !F$߫cm9nځ5]ޥUo\ _ӕ7t n]HH="ʆX*vcb(E@=mW:Wٻ߸q- 9/FSz!銃DQy~yk'kq{IjwmRJZ{s@pfș _JxS6W?j#W=N궕J.oKn?Xo5TF9(vIQʄIݦ( Ju*YއJd*J7 krw^<0@ߝ}N-XU6KfeIr߮|0X}Xz[zjGt6+4Arc޲Ua2މ\ χ}D>m8܉ ep4L6\sz4f_W;8!-^+LTׅމIX(g*ӵe%IYxaki^lUH.Èz,x0`6`G V`U"R Y`[f`D?UpdQ{bU@V1* W֕-jULV0UI&QBsā?_K/~_R_;O{%A/:u.lGX3j9ǘ.D>sO 8|Ïqvq{{GsbS-AC.biV= gSQp9\C8|+^ o>4>V`):7r:N{[F"V)/"M.>G%=2ĥAiݕx/;[' C H8 V9yz,؂I? 
!|#04_^p<4.BE!=a ѲD f0ܐ2U%Μi [ aXEZ)"QWpF袴z_$Go%kd9\~|⾷;SvG=^4iC|G6(gLyV KweAni7I~ Ҟ{>(WI4䕫hN\=u.[.bT';RtWaYBH6r)#mF Fur#źMyƋ["UN!]I wkvH1~.@]SA]S,[@'PaA]FWm*|AȓN dAZ-]u?wy4]J7*@@9aa&0K!VT"q^V_P8Pr-4gc3; 0F+S"ė*](΁qs%D{Z mPiU:`{ 8>$ǹ&zvRLJ(sy)*wʪ0 -εb ~l@}r∿oWQb/AXsu>J@oP(+x.)mqovLjT^hߥL~@K5kfZe͟0B(8JDbI+6޴qvˮ?ϳRE\qT\@/0Ș|qN2si)fEu5 ٴz1=pY\9}8C6gե1.~VdOH"ӤclYIӣKq7̊Llơf 5R%3G`YiLML70o:!zof7fդes#I*o S͕GuvhiOکLnNa҈VLPƒ@v@[ -ZӚBCQc+'u9pgeT1Ǎrdg(xfheYԊ)| &~o]n3uxa;vu}|e4lT @XlHI2da%l :vxbPH.(M6x.-W l n=n;&Uzϻϫ߂$W]\>;56lB<yxS]L`:6Zjܛb{'j_ɷ{Si,Ѓ{H{,HGs _ QJ(*=U(%I"D8 ڻ~Tg3;-5d_FCb+s_S3bްsƟ=L<0IlD皗h(G r ]@dGEC=v P`2*Bs_%" QB@^ JdGBg=e Z0TOHS-il@_'KZ3(M #eK5ƕ$5~جZ<7S p 6DMors`$}lwkuCƴ>A(dl?n>aAՍm>|l6GH) hun"wG2$~Aniwj~-'Z\ҐWY:%p~z˺yW*`ry:ߑb&L9f1"U4KZm[uZ\ĨNwX h^CZ|tH6rT$$E&9pgܭS1oS)"q_FTB)/T׈SQWhc$ oxRKKl("I[j<](=#Gav 'h:-}.O4:&Mej(OZzZ7*ҟVfD5?F$/!ԔBfɿYޖ !G4˞钂[!ք(VPÃ?Aq{rPc}3IC4n(W@ 7p`Di,Z^sTqN?79*AVڅ(]-Lw7#4G6X_?O cD;^Px?;;FP{YDʠbs;E`*g{Wô*⇕)sXl|^?a?~rVv9@5'_'.# SlCzD(Ǚ*~.f߹)-o˯澰O!OԉfB* O:ܐϋe$*nduL+(tҳ?]Ηn{sJQm@S([}UZ .hA( $zIefxT/*Ow. d/> qCr5CTBiqJ^Aঀܚ%v|vSvulݗ*F6?l? +G?~^B-2 BsxO$GU<|u]B[|h#lG1䦲wȏ :0KS~=wSwZONzx{n-XQxwDLi@4dy H9G,8VLEKg\i>X9bؒvkGQPInRV&EHH $8xAmd[}G[!}![%WcmQ̮hz9T`O} hLT7 [@7T#BNpխyBqv”ѣqnƈ1T1HNo4nǜE`խMBqV1.XG|nHk9oy}y.מQãQvy}yv}8ן>K Ioם7G>q[}j)XVP QNQFn5JOi\jϑi\Vk9FiWZKd4 qi4C|.V%Uu9&JIV:8SFI=~|D#@~VfiTJ#GRЉVHEv#iwE6%e)TI:*HHBjI6 *`5 6!Kc|)[2UHǻ3elus᭛BeS;?ǃ0bl(|946?Z^_:q\_n/2|u7.E7vt'5;/ V>1&y%z!\TǪ/]Q*{](!sy#Ԗ%ͭo^|]_\"5odeeXUxY FA=~r־K6RA'K @lzp%m rX Ӳ6C>r/}|l϶vG{Cf7#[);:cC2V˅I̅Y0”7 V*$u7MC?y2NSE!#%γ9Ya1  !-L…"G{`!9,v  JB+[ 7ҘC fr]iuw *4}hq}ڣ0Uq`DhRU3yRGOJ'ަE0uKlqң mZG&Rcgއ0KH e&%`6! p8nν-2βBO ]X׋R!E1 &%gi83xlR!]$16o|(($~%L#+$~mAB7T>YneèWN9i%58I:%@ zHt{p€;7 k4xi.%r(P%Rǧ93hmkîq̡vlsC))vmuuڡk3UVºkQX`p2c"fxЃwhe.Hg\)+eQLrmCYB-JpRP 2W9Fr*4 g1&Hl r|n4皎U{;;6 (jo01QO{xh1WaIʚ̜" M^e·? !a 5+wvz%DaoWTzs^7S|D9)9q!R&W"Z1Nh؂PJS|$MdT+]exTBMk81a~0^ܧ^cq%P0hF|8ax;&WMkHv?67ѤĨӂ H{' N#R]|5KӘ1UC/aA}!'&pdͰ. 
SȒFa8稆ldaӃgq}6̨8Qp,BY#5\ઁ[C0AP :`9Xp"Kڨ>ܵQ,6*^aoÏqJfwZ6%\0fny\)^Xm`&偄bBlɒN{uS4Z"SFB:k ;:~p9}D*;<J!8D`*tT7+ZGn]chRݎ8[hխ|턩,80ЃuA t~Iu;,"t[Kխ|aJ%bJ-BezL[z' m€4>K6&< i">0&ĜLm|"9(cb.2>5?S8v. ^cS|wNSbo3[f }Lsq >('Y‹y4m,QGSg@} /}ӈMto uqT&1}OwOK?>+#o8%݆CG'( ZC2g"0mEUV9Vh52g'KyV"0Jť\(rN[-^;9BQI= nT8f>mJC)Rf(gUjeJO2,jm9zELA RHD),$yZ jE$RP;-#Pu"Qz/D ՜Y#8>V hj['ݶ@AZ PZZMLLsF4 DsDn5*dJOit*8PFlVrB T%nOW$øT%][MX?Pz(MSdQ"FTc4D2|(R0e-qR0i(-( 'R4"k:0ZjRzZ:mOC)0 Z!CrW^ooUXR6Xˬ}Zv0=JHaztwLah&J{@q%zo5{E#bw_~7Vsfn+2Yb+}!cʝwoa90BZ.m_~{99o1RQY5I u`ȩ.5 !S C#CZ cOS=ػy]4)|.]}Q\+4ɋ &)jn$ @%Ecsc!gs`4Y/@"\f@,d& c:P) 4NI;G#J!-@Qxɝ3k,z'P? ߂ B)[rP&21<@ftzcӅߤK<@f!B8^ѹy3g^T>|&&6_f?"4r4~b{卽}vC)UYLx=NL0]<\&3Xg [B ` zK*0NqnF%։Fb }!NbmJa]N ~;a8J\hNZh8\og Z73 ǩL^%Rly R5~AGwJD}VO^sQ a R?F-R<=^01ojѼjݵ>jI>{nq`)zH>00zкb]veO1t0Ӄiy< !uLn9q~vvA_ pofg,O\y3câo>[iIOCbJRכuHTՉT~?-6iYU`HW,P d]įDy+r\U-~ #U? i?sڨfpKhuc 11\p`ʏ|KCwH4*xme?!DjƈǞy#m[ɦZRۍǣUEVaGk0`qv#e'9t}q+ժwû{?}ޏ#~w [ﴣ[/܊E{E i V 0ǫ^W)SCd+g0JJ<`9B$lEvhS* R}'t([oHk(V0U>KMs/#Zrjw:j^%#~8zU*,Z M3x!5'] DD&1oo>V'YЇpRnnCC,ܐXwW?W7Wvisyʭ:L|xwM|1"Ol<5!b2^տfm]vxur}LzX$_L1P7-Z\&zBhp+wluѻ :HnE_$nEޭq`J2|Ĺp -W1FwsEz>!Ĕl5319Ę7lL4fşM^6?c%ԧe!/(­)"Fj)ʼnQRli5D)RJ'3J)?V|7d(?c%dĘu(*i&RS#G)8’V$P ,Rq#W|lNa$P+~#;GRP*[L RHDgqZ+ՋRD!A܁UC `Aib VPXf>LWUV'9!lC~tKSz[MOͮK(z糿^zeu+̫ --X|9k.d^6-/')G<ŃF-ݫ[BCfBB9J-nT,ZŇkŇ/HzѕPA{]"wKǛ='lڏ7h!l[<;cdQ$i},Xf[1dj 9?GWL>$֎dH#Is _GXŸ(%>^? }D~>P*;l[@_ld7qT7S}(h4?]qH9NhBq8E-!ItrXVѭ#syNËG޺9oӺP:9 Si Izrm95څclfiwCZ ch`%VZ*B8,j`EŴSR* xE`)M)(Jɺ 4#u0ЪlE*@ pK U`@[ (嚠ҚcGd )٪b"Hsl]$CNH)JQjPj,@UT%(QZ]r 9VTI3JQg=rUPT}\\Ӆ.%PQ+U]0Q;d^X5# *%n02}h&ŀ$QG!(L$ EM`Hꡈvz동थy#} .?ˠ8g&ӃLרSޟB!7Gs> xVn;3c8#, z7^*_Jؓ$IJ">is o>OKT"-<%>.LgK<+K,ΠօVَot6:=LAB&2㵣4f:G3#4Bb|NV4Pɏ[(1Psg*g: E3"?qˏbG}3y62-zia2˧;"x&xx?Of7\ҌbL^2R+S!rPiAB̑.*5YBz2?fn {Fic=?XpP5w EcU:S+E]X,tvH\nw~0M1i=C02ClNL9FI8R% l ӕX4YƅyQ@Z, ?(JRFQ.D ŜPi%^GW N.J{=bJJ)8tH1˪Jn ~VEPӝ[-sƍ\ܞT24Ǝwz睜l>])H_b:6߉\1!6y(Pb*t>1SJ3!+t4R2Fp`#P΍彔9NK}WА ~՜P;W! 
,kջ5=yŒ ~aZ`t!?캣(h^tTbIkx^ڝ&.pGsP8 99غF5p1i,'@Ar߅ D2yam'օeӺf]gen9c~){  :—nsI[kRw/-[KY 5xCs5U#Ef_r49_W~{4YO~n񋫖y2JRqi>?^^/cレ-K?g-M4; B _TA!"[iֺe}>& dX~席u!C!w!8I;=xΓ~}3Qqf[>Kj'{^0d;ԊW&%-A{Wb{N{K!L;Rv>ib.ZۛϳQcoCH~ ?w,W[wW?W7WvYv}ybTB#yo<wGCw,ަ&;alT!r}LzXƛ_ ?t)E=!C4 SFM~n^s8~#Ż{y*6wo@h|ĹݨEc2wA trEhֺth)һ7ѽ1dN+'yr28Qṹ@pOOU2~m!գ5b(gH֒ukiG՜z?-72gj=|ڹ|B- j)* *,"(~rXj@a5"|GqWcjqM5_ŮB xe/ ct2މxX{ގ-(4S'4TlzM8w@9^F7T c]KYWCɥpeaP0]K-04©JX}A+$eYU} a ! (R,:\"]`#I0+E/.|'煽 ~:< [}o٣=0=oV&yptX|['y>?c;懟~kqwwsS66M⣅>M˕,q3CbTY"$ΒV'WD1WU8Zt+#(d\MĦYq^E 3gſm2Oi=!CF3vȲvT=Cǀ\Hu ^v"]yD&-lڞ)Zό`řqŠB0;պI#%#ύ2[m"Bկc(Ck0 86Q9{tݷxdSˋTFf=~vro')̦{ZlVmBt΅)pMzRhiĿ[v cی9*iSH&MUJJ"tfp@ J+8\PFR]<])ƹ0[;GLh\媚Y wTLJ]3^WՊjw J HQ" W90 ]h},]}z)~/z|խ_?.o"VSGÆIJFE*n%R;{Oℐֈ3+5@EjbmX;\3id|HR8aYT4_^l,%9:&Yek^{'l97%GYֲTU %#Fc(gD\p H*0ǒ(7y"*H0>a8qH39o1!HW ŋ`]pHѨwKdG]Kj445Ѐ<1 cAVN\# Ad+eVhw(z,U*j54`/Sgͻ{mF> dh`xOm. ts(8H12GvLR͈q8>A2⻹ǭTQ=Z_|xڂf@mA{@hۗboûNwGn:N7Rxь `--һ7,L)9bCjڽIǓ(o$񌶋6m9:8Ovu~,Oy`*W{QO=1o]FćZN-#[)jZ]EV^ Z'G>UɆ ?O.hi|b|1 J$ UPhkeN̂Q,,,W_+bN#(b xKgѐ6~˅緮fyz[ּXRz bj$bY"$i,)XĆ>/H9m 4\:Y(8.W;!3D YZbski1#Ajӭ 8'_}Ot\1/5WOoW/hMu%2O9 pFңJN K4/@JO;,\е&\)QTa1}WvQ}L5!].VjQ-~yik==7/mD`,m;;(C94ae2 * ̉zC6 NW掜9hVғ5y"ڈ(X*7S_|y/46gqW[\~ۧ^26Ǔx3I#j]OOoj(jP`15/`oO>_}' 1z71Mz7Bxk!P k@nхwwmT:uJ&~T@x);7S4St`o$Cgz(îo4}q}ڪQx)H>"ܟj$c'V(. 
tϷs|zyPMZ߉ڟjҴo}2զ lw4|!T}  MMq'Rz7nmeb:mĻ/:] \{-Z qaXp-e98@!I4Wt«]OyZc9#xJ⛤ x$?MN*o8goi6T 26PHmՙ} A:SctK}QOWт`QuC_ELKF I&1x":7[ S N?Jnmv:ɔQ}R\V,jat i\&hPWdFEM#)Y|@3 {/nS5 &g;^Q[ur, {A fM-ǽ̓teR?2:\!%j .\|6_ˋ )võtڀNK/gI~Y(Ah%e/,\.~4Z*C#MM[K)0Kʈ9x^XRsἕR)K&ׂ1DH|+eSa`daU B =Cc*yp;A ZK}5No|A i{DauѾW[+ IUc9ʼ`җΠ%0.8Bn-"'=pU7xpM>`tJ\OF'4 j?f' :-~]waTAi/Y[QE/D?1pRP+4GN+ƕ1Dߖ7N2B:O7 ҚVvqz9PUU1Gvp۩YZ+ qI57L'aҬ Xy[)&.b)OJQ}BRJ\NJI$R1$Ӳ;ý'i)&N !k8AL RZ"| Kê7>.:Vg\HL]zB;IEqފF9TŇ~=^ⰚiBAOMuQe^4рak.:Ue,hxDmHmNmY !1~̍6!pXRX:oׄ\Y[Ι!Z{* JҺo9j*exSIuu(j#@c'UQ<9X2 UiDKynsJ-}!0 g y+2D s ɱRhR*`LBz\j͘(9/܊ U UcSu~NQU6tOWnZ*,f N%͛tJ qg6aJYw7[i& uiCCM+'fn<Tf{O-lԴ, X::u6`'lWI U`{eaҔly$zkߗL]T*S"yV7 ɈxZ'FDn({tLFp+\cӂH) W_EvvF+([<j[h3 9ckwpFtԥF1ڱroT,t< \!;Tѕb9ZIKj@4κpv=2 <2@%u{Mlj;Gꜽ^BzenvX_ݹhOaWuuk(@o[w\Gpp: Z>],B dmgxΪ3M6A(@ͦZB k(7'ĵ[FzYKY}⃚/,$u{&֖a-á:ZC:cd^qيl|3_xEGMF5_9'NI2?'YS>YMs0 >OakӞ"{&;stt4\ fr:Eg#uw=B^ty8N񅞾7@Ok'T>;kB޸#B1p1gxB]-WB0Йw rnMX76ץvz&Y]|t<̺N;->q$`"3Zc"^[}XIؘt+#Zjb]n=#.8u͈kB޸lӑwܢѻbb:nFI-wkB޸ٔrwSEznN3|[K,S]Nqݚ7nmJ`e9Y"1j9a) 6 d2ݰS d.w(M #! "zJL؇}sGC3}ZZo1J>veO43߿\W"-Q܃+}7L*:7rܸmxW}Z=Rܮ[|>F*y^qmٸf\ei&38 նԹwRNfo"a3HR:&LqD<~I[O\N?==IA= X%&UR9CJe"zꩤ!J-iT%V.zoI*VOߒ}iR0}'uQ4 |WRxiSǻ(G^Z⛭LIVJ=Hhz"dա}zZZZLof|1iHj dk1R(Ycp2NKP& RTVVUr+jR-Ŕϟ8zrOsRQ3^ݩvل*k /ZEY7RWSs&EI2腵ΖtsuY[Ojj، 7u( q !C\S1 qw./DU\-a)=0$ 1DNp"90i("4‰Q fHVjx#lNHIDI%P !F,z> < b G1ńxNQd CpcL`3#ˈF 4gT44MeDGږ0 iixA@yNi3 C&$W2k7h508*;",eq$e=*#tB}Ezљ|{0_ؼ:ys(}Y/%?woo[s8Ex{taˏd~?mdohX6%sY3K )8$t?QAߌfr獣F0c;DI2 v7_GHKǑĒ3$~|-e7PI$*k}6^I#_Yߺ$XT$y/yQ٦Z>_=Ae vp; vRoZ0ߗ#п}<[GunZO+d++$`"@#9 x D 3#fix3Dadz(Xs,o~ownnKԦ$sv8m:C :Dgö;K66 ֓q=#1 HCԞxPJ\0š(Q^Gm+=f鵌`@IS "LIc3CDh)n bE$JN%ƒ@I3m88J8L{\Q$D #*َHU#U2*HU۠%q]%xi, i C^";~3?lh: "sI$9uw  DXڋ\ZmT=dV SM$ :Q6Sm: ;]K VB&}Wwr1 QFI˰޿@ʑ 列0`XS;}:dϑC*?? ! 
[rD莓םU;H b>` NĐz[Zld ë/5Ą7V=E&SȋMњ")% Зز[jbү8>-NrKȱvM4Ȧ}MA=bb:n=F_Z֌նw /pnMX7 Ȼ AOX3z\ED#{ ukB޸l;7XI;ܶOvܡ;VƷVd}kL2±HAlèdW5xOWgQ ddw|C&2n撪!}uVlƪ!T5i]fZwc{v-s̃-LO7eǛlgeoc9>6/gy=J|e&/+'Gڝ` X u?5vxӃ~$B_<٫~mPQ^3'h7@jʬ]/xuؗ簥cbs݄HʭEjV*r)^BfO`AX)RsȯTkI,J+'ᘕtY!FXBa}4%PJuJTBA*9ŧ{*?XMVI9mR$S) [}!v]: a]NT' mN41%@,/OID9ʁYTMSb,Q 'jfd,͛QfuRNoe#(R|'U<5BNjF -0ATW!߲ M'I/-ҙB"!{xc21 qU؊F$AK1DZ qDHd1$劥1f"XLae6/T?YNP4PpӓEI iex̙D'eB҄1)IttQNTcHf@B1@de()A*DG4*B2FPSc)!0A8>%G{=˾ijl ܬ Y4OgLM;; 1 %u$  &I*ǜ"4)U× ! ΘA-cEF`zR̠,S°+3h4dz,Tu@K6siܘAs'@A''g/ЍtBHƄI/OQ0fНyTr`q.p\|+\q5ݴ@h1qk ;֢G 1n  4DtC֢5A֢๹ uF$B>d- ;q]ߌ14$ . ΝBmnh;.BW[*hӯ J a:noIM;b4BQ=ZMi&/IS7u[sW;=s֢' iXN *ם~א#B NPP=X!RC·e1 Y|FX[j5'H-(}o{ȋ%6Kҡ/?Q1}O7}x y&bSMX9I~. Z_PBfMDѶt|7l Ï-F&oofr=qϏgc6~0 L`֛q>mwdD1c:XM6 4"zkf0zwB}ϼc\ujmL=' :;MX7 |kc& T Nd=o,qWI5`!oDk(I rC)#\ AgYqlKMŵE慗QK*ڝ7R*ܬ4"VzV VיPyF#%W+h+EJ1T08+=%}Qjjg\"T:ZhB/$?*qF01&_,\9$&i'L!2DJ)aQ$"1""LbA$ *s! Xi]gmo~*8au.$1sxͯXqڒC(9h/H{dzi׳yl~XF`<ob4VX ^u6[o?hh vߥJjCTb^vN_E  /YY=g{δ*пhVtlR̖M=*#YiȼwO$,I[|Đ1J,㑹9^!CDUsLut?sTT9]\UYw^&j.pկ:I5i?vnXgwY\9W,\b4QQH=K͖S*6+ 2OnkYȓ[z!)FtKU;Wh*m 't>{%1ΆUqΪ\UtQ= b kXbY?sݜD6v5%ۣ%:KZnHZv/Hq7.qȥcq f.}=%4r6_ӢY3|eL9hGt递2*'zFs$`@s21|d_բ]MC=+oo:wWsn>]eP7i^w{_^,9cqf G{7WMz&.g-vqF$1z h"PUlx}6nVlPWlv !ccp& jy)kZ͋mxe=BiÆ Ա?y.mp',03Ku;|vx=#Uc`ڼX#lDM <)}qO nh;)|X)_ZHeWٌ]4.H_fy۳;$xtWOk+`B0@kc3WSnMh t(TpJd;![*ODfI`u?tfl޽9fϞTq捳}nރO!YiFG:5X?J"kz7Eps7r۳Xa+C_7o4`Lxrf-^?sKI.Y̭('f,(d 5TX\T1a|,uF[+@Ns`T.UK!,j' UkY81iEXҷhս,- ž茷dG~e pwa`,i/}"W_g4Ktwug2AwycbP_vO|f89PV0' ĚsK|7Xu!_y>gƘaiSŃSwݠUbYl{jeIDk-W\">7<)}Ub}-PJ4{;\/2~=O.>:-AլZ=Qp 63]_a$' QR{~#Oj>^vG'wK:N'aI:P'M:N N) F$++sFy뫼9qXA[Q哰G{UAٗ;8tӠ-| S= ^#륇?j2%2ۺ0%$r $󨳡$9ctBK)8xp<[&ҭ|DJqWf$E~yyN-yq>*E` r _*lDMe "5 npq^i%[ -a#'UHD\>QFnɆwܻ[+k8[%:[}2HsO(Dr2~,|䗹jR)*z /rCQMm'_5}WʼnƏi,IP~2z;vWA}ו Dk\rhďJʪR\@,YW}]Q4Mi|UA46Xb16(PJa 4#RWK8yͥ)Pzhc4c@A쓉ނƱh{ j=!꺐z!-&jQ7RSeuL DͻjxDzTR JcZB,NK+q*\ZlT( JPX3B11*EH\oVf,Lyqg(T٦>')}}R 7xs1R4NJ|j`IJ_rV&݉_C-ryg_,~lv"nnvX?>S_ ؉-P^/uA ;|g Q?Š 
}{(GE-jOt"K}#;|ic$c6][_\W[./ϫ-yy[;ŗΨm-yCw(mk`k$}QT߽w]s@^Pב{]w&aǷuCwANHvv.B<ąfÕo%~[%* I&ۨչ/5^'ISӔ^WJt$Y9HcxUyolb)7ēH<+E8CJT1.$D3jϵKnhVf;_|CJ+k[cH}Fڪ)^0P3HZ 4LEk*T.[fP,,ǼKFѬz 6h"Ah}ksQX&؀)!9ݒZ'PeSEt{)ݙB?ūfr@fF'D>TVH Y%E^k*H1p:ʌ6p6+w^QLjuv^-y/cު1ጰ=WcR%-+\yw^Wc,޻XF_Fʾ1'Z{aEUNFiȕcQ~@MPNm*_wJQ1\p@+[!& : ixH R諞PKl^H=A} 50v:a5K(_\Ԗw.׹u:tƋg Ix X]UnZѰf/ؠB rri?!Ekf!&HG43k/tl5?nPk)O2Af!o8h/?fsx6q!!b9V"s&U"!hVp]B7;I6%BC* w3J^JSOU68h9\38}n*+D ~Zt&oIW7p]vfTTv1T$ RX%![ ܕ2j=㫛8u(ON$v#6I\6 ٟ] ڝQ66EHMU?3Ccw;oCWP-5!4@kk"clb}#3D$nM C<(G GFA?-#= I9uFu }"TɣE!E㊗7 Q`EU(9@d$* Z̦qj:\>1+Y+t^f:/OyC(u"qa!MJuփRTHZ3ӇT ƈ\0%`W9l7rWTجʑ˅gkm+G.x"&X,h403O[v$ҽ>EٖGGFǀ!Y *+X\[j!4WŚ2X)$Q|niࡦ7 с~2uu\tHVfdC!' w;u`Yk8ٻɯѩZͅ;疻c'gle~O-ZrU(vJyjވ}' }Y*~T17UҢX*4DCC{KjS6δff]D~|nO eD.=KՈw6DNޫ;5nR-bJgj}BT(TF##@S[.ٲ&nW9Di9xgc]S˛(\wE뼺\o~YLW{ W2='~bͫHtZZLcn:X+Z83ًm'C.LrND 3c0߯n@Fa bK ZNk@gcy o@ggi@{g\N;64L*?BJ!Ji+tͲx;C@fhL(Ӏj.1 Xb@vT,W^52;,ĜJ:/ؾY-ZD9U.^w46ג ol5G0o?IW7Ktr5juBZ&!V' LRjbtL3zI*o^h] ~ Å+:`wFng|ZvG'r¡\Q:Z"U3xml,/[- {ܘiyhX0xq>ܸ IJJF>rWiw1GwڃT|iTQQ·M}/bs{tԏ}6-v `/`Ss{Y4B ["ȳv:/Z팋5:eE!C+nej.PQwքp h+.SSڰ{Ͼ `RŐ@)U`hW{fQ9GQ=us<@4M}j˞CE<}%z#5U+dĪJ- \j%w8L}v:MOOz aj3Љ1nj&3i_.L8DY wz%cpqI05Xzx$>1YUr qMKs2_iatCg0/ ^Je"e$m+NMx@ۚ``?z<=j2{^`~7VvN%ς=˯SQ,'\\y*/o ANdfoj6N|1E%&sjGމzbyKާ=,.zV݅Gj ZUCkٓ8Cm0>'f*5x @Bz?[_g=}(E OÅ4Q$;2r4> 2'і^1B]g#uSk0 k`pb_,lj>ib'kio4멷\vµjw@c.6?';wڸaM7nxy q?ZA*q;2 LeĄÁzeAkeѺCRַ à-r߯|ǛIt|#q]Ѡ~utUkKeb Ͽj$N cE:% E7{Ӄw7SU]Ԫ%wVKTQ{e$?( ר[rرz>}1SmzR7uEyODbU/ĭ8< Ro읍Lp3uQx~ *t@;&d̍ZDʶp1{F7eo* Ӌ]魯}< -%UZ.SHބWvE2w/d2D rkbU ׶2{|-4(cKg~J%F[͍Ć)^ _b!ALIo_5y~c&n{~>1^yŸ}† 0(}C{F__~?qWO#woUoo$'*'p ё15H %솫Ae95{_ u E(mZr5,d!` a[k#~qL"zBgq<%*C]f1ͮWg/7!X')~k]|w10O}QQ ] lbYñe ojbwE̫]%%ܗ$J]fy݊8'Ba`jpC3QB8ȧ*D ]uzWՉk%?C-mC` ࡁ9[ PX%-6InG?2CΞ?q]܀32eus{ $ T/BJx5%Pϰ7ٷĉHb T cdP NK*R8av;K3J,7|Jz(bU#q1ĚsWWUVT"n\dk/ך6M뉸\eǐk~ ]p[U_|䠨| 0G׬jDM1+v)KVx|azJ]1k7ɳGh>x Yi;*`[v>mf5MZd虭ߛ\c Q?e j.nCͭ+0ʔQgLyq*!OX7|}>C7H3%-D3^C]#\?z JIdn!U)By棣~vf$zs,ˇu5^f'aM}@G_.jp0hV&o?-E1_(:yf@g-L-XwTbL% !ZK%R$6% Zn-U}Y(a:[>6ᤛ!\٩6'9څjqL\iܑ 
%dG?2dQ:tMh٪2;\b|Q,_̾l8^ ] bk 6 MݧJH[JfS2EEC,/!`wa-F%[xM_ozӹ /> l% bAƄ`ц "5G!wSsƔg/;sz@W/.M/:BVNCaޟ^ qķ/I: A} =@Ѓ2d8&iPiNUjѩF(o⇌:Jiī/>exV\ߜWrRv5 K@ͯ@z!Fo*7kpһJ؅}%A^{ /Ңz7^hm¶眊0R+6:oc]w[ւT\o~y%j OKέצ0nRI epvR&Ʈ}9cF>]jvk H ˃Fd-oH~[:&㮤ujm +V~7lEyUR&:' `ebos.pA=f2|R[lo؇n2$ڋw|鑦"}T\sa\g`-7lUb!#,j R!;WƳxoXdFnM7ωR7qhΔޞK8/ÝW m]ZR[]} VW߯# 7Uw'a-kkw,[cB8P^hCq\jL^{R ȉ,|;elk?{? }]/;(>ޢߒ9>Y -C `Y~qPY1&;z?qSFɇ:LFE S|?ݹ(oCbJbj0ZU$oDwn?Qm\T S9"F#i=!w "7T%րǦ7b\95,uOK? 3(+ LG5 szap3\r V! xrsY$f`/r= 6LO]/\L ˾7DKD: _ݟ'3lHls&o7yE41 [ i^,SBN8jRrwJ""J >( @2,ZuڠӱMJajmX6˄U Ib5V!qJPl?yeы|+4{9Ib[.O鍊߹GA'/= QHcK=0DB3( #Jsi'JJ\Koɽ.@e DCYkK'9Dzю dpˎbyY@s%J{:1ҴW$ZQ2e 5) 2aY*^eIEqݛxGuG1ڬw-Jxi#{roRhS!_SNܤBɜ,I\&ʥy ~{<$T7N`7>u #4ڔRd2^Hq㡌v"Y_PHm~A1`N-}$rB0&ȷS"^)Vg2RKCMr.T5YJ"$GXbUƱ<9X]ҥڊvI#QF3mS_f G #"TMRIsq.鶦S&Y@E]wZמ;Wj,Otn\KFR%4 ìL(Eύ<%OvV# 8Mm jJ-g\xRI#wWRP*iVjr^";rAO2~@Wԙ;JOW0|v3w]]|2uΑ  ^";PNr\(ON7K2R3.ɷ[ATR2GF2Is I.bA4HH%t-CdabX{7 Ƕ-ҍ1W\)eu5XB1a5 DŽp84:W3L6K&ýÔ-f{)EcB10=sZ!J: n4*EN3PC+* _ia7woOLN]8!s ;%BN>_<*R[v0ՠar\^~U)ޫ5š̃Xx,3nq?_:E=jq<. 9|9ƣ|xSr Eh!ݠC ! 
.pZ$Y~ox4&5!n&F;ID2)F÷G8Wk~bua?(wTZt_h x:Cpg\8qXӗ liVj&ۊ~BHC^LC/S)xԥ̕2ud),p)TghW,M5Xʹ72ӉR̃6&xbmf4u{AuM\N1FaʼVۄ2SuAq/(hȌ"oE*]20:Yܰ4*uxJ6=>uTZa#@WBb#RQzV)g\/"a y;6\/ 7kd5N\RĦ^瑢qڢp6SI,jW;BJ+@P&Z5m|I}uuB + 8 ZQ= RISfaut!nS}_T`z,/1k]#mv'^{j;Zzxg JՐCc%?9l?1+[[]aat0nF#*!8S4ӥJ dnn>':5[cpn| VaLZN6u+enwB*HP~TthVg8PrscD(FV͕(h$8 .($"Q`K<Ϝ1-l-\Pxv^<;[KG׾ܿt4RE\,븞#gP0yL%]M1zjxg#x,g_Iz)K# K-SS;*Z -#qm}ֶxsxD҆:]A*TcCT0n9#5v7[)Z ?jAU`iMARatF)SOplr Tsn4qe"-TL굽+5jKI v?* 0-z_aVRn[`U$'s?j0W!o6Gf-3 ^/yeԍ,dr~2q$X)h+os_/Zp oΏz~Z6tv4bj(<#.,EM;O:*\_ x{׶{sJP.\Dk_7p ڭ.bDnU쮀i)ΪmV?m}vkBBs%Sp}Lv˃ѩGv])ЊnMHw.2#eJZ ŝ5?@®2Iܖӡ:t"ϧ4&uK-G_ 6!, ٩)Bp8--mOm#ɷ} k:"?ɮp |g ` B|.< v?3Dxg-@fdV7Pd(pJ-ݯJp_hpU2.0(JC[`$,,(ڼQQ^tMPJVńZ.t8S$; OQ rJA1Qcesz * l6dRK'ٸ9^˅f7/p,4/5IuR0w+]ֶEӼ;g˥' s $_p(vVnX 6UB"/+x@^x,/@J\C%^>8(0L`yrClFZ] f*hO Gw_Q (y=Jd"y*AHbhj ]8kmXE 2'ƾg}J@R,[ )YvM2(3H" 9v}U]]SU_&dp±4[sr!w}T?^B^q7rMٹ .BG|驟JjDꔕ]6̭nr>u.yzmsKfڤR(4oj)T4Ln>iI9 3g RޓŸ|e 4jڧG&j.B7CR1&ed5,1jrXX^f@UU-y;!c}eL'D$QyDWw:oEh j˨)DzkZʂdaFj-Mo||K !iw|W ȱ8#z%[1o,\ D7'pD `db^a-`Fj2dEv,uYLf>~dzt%@;b(G<## p"R)k䒣f`4N1wM%睲$xd4; OF }$ڳۖ!{c7 *-4l`|Y|!+w*Rqe^4~Y#B{m~DIV$[V{$ 7{ʽUZ}G*P]_}s5IF|Xus/:mŠqi{w>THNWNCX7^6qWMsޭ/57xEETn3K*MMG[j/XRzbPEA7:p!Po%w)P-KMtɦ*<q,۞=I,"OQǺl/Qx )Gqx+sq%i*8Vn>ETsGh:Hİ(- 2԰I[4>BIAPkK /cb1,R[ F%(D9<'>+#d%a*X^:H bryĨ wkaoDX͘f 9/!ߊnM-qo}/ .6.]1^)?Vk%h|krEMV7Z0ٝZVTb z}rvs|#&,LamqɊl?EYrte%wQlz+ܝ$^|6|>NϿܲ"(%;G@5j!Yg4Wݻ7D?& aLC&0AXt)1 u :xGWru5(M, F8\RCaа WVni3b}#lN2խ+~GGc'@DCRFA X=N N܌o<` [r6gqQ7{~}G.#[:zNEBr>aRHC84{)aފHc=VE۫}UlUmiBSk!sȍ#)mtqW~FͪЬAC%ͮ`aSTOkW#C0squ?tŵDa BIy? D8 :Yt{d\oy~W\uC_1AzwR>16(684Xꄫf+j3b"Z*5]uAcaip?bP7:QndlʨTFX7>6Ejo'?nлŠtFud cޭsJ6q])))2Jy?O.UmV.Qe&FnD,ǭk"ON˂ȚJBH@ѤIzHKU<|.iiut[Y I hSe]- ແ P!̯I}0d%;~<׀{^U켇U8_PNw!*D3p_X;Χt)$-RRg9\;}:L_~}7/]_ͧۯɟε_ߔSےןt?ٍݕ\-?6bO{[v`|k= ZI[o+ Cꪮ-ꌨ5L.1|+\X'k;B(_JewNĽٔvǫ{I `9]Z\p6wV+5%νg"悵ʖ'o'mO8Ͽ_٥o{\!2&z2 k 4M(Q8e?]HօXV3]ee? 
B)P%up+W5pW w[҃hud݁BzB>֊jV2z i5ߍia n9Ks(ù*]_ɥG"plGӏCK-YGg+?^,j.N_PgTw&s:Xgl"ŠU#qS(JߙWmټ؞N͊ѡZFQJT&ݸnw+uנ2U ǕjsTaeؔn:0+]68-GyduAzx녨^Wmba?tZ{Ip`Lz^?^Zks* Zz|ȹ@E.IQXtBӚŀS@Ř:Z9ңz<kaԨlc ve^F^"1|I@*V)Z6 MrR:}! 的TzH68/ _OmoGi3BCٽ/i^\ڳۯOS}?gKTT#"m^ϳA٠IE`E3H_=ko6Ř/܂mY|~C]짅A=p}Kj{mV' [MՋzJe@%!PN!jc!$RWϸm hR:L>XF|EC;н+[HmQQ*Sqd!W5*U Mr5VX\szl`Rڒn/;oMGuПzwc>ĶOto?=.zmq~oz|mP,ox_CS7J*+]/N߰n`|jk}=O|/f૷}sr;F]`Sg뙅ͨQ$zsr@tuk5[53shI9\xG[A}Ѕ! $V0ZP]XFB] T8ꤕփ,Qmph]^BxhzQxjb F"ۦnj PzO0bm"=iӸ`c [-5Ao 6$76Gu3fv`qy/mYK=bCC'L͑$dM!]wdՍͱx'vIM=x{rZ[/?:'95z2O#Gc0~qB)禡kw&O^g{,ܓ=,Fʪ ;7ez 6v:' QЂ.1Cv+xv!2n˩CҽhjxYD8/r^R1lkU꒳gPE_4(iz>}|hxڱ vݗ Tؗ s=k7"56%3}/F馎 ,ncHȟ\Dɔq8i7;Wc݆ҠDtvۧaPNg SƐ?n-SFg*_TºJȇԳ(X~ 곧P`0Ri,Ը2)mfrCRx+tiBe9ݲ}|e*e8 +DKQyO5QJ 6JaնIr٨U6 S6 N)bV5h]O?rX>@M5ԺҘ {S~TƘ 5&Xi TCS AYi| 5BMFPF@ 3ɩ;h=t*ڄf!/A}j=JAK.<sХˤڢG)=h)->KїIi 5RzRZ_7-^,;|}wg_~BTmu^1iQxҰw4 'KPKGKΆ%Ge ) aЍ8WbϪN{,8M_/SBc_'?6W-OM奋 e'7u/ۇâdp2MwV}65:BK]ؗ/kiC|ӯNUBsdgw uO*ʟ/oqh;IRnG]//4D#d`~]ыZ)͎{wC'pYڷQjȣlWeK䰤.Gh V`Ԑ R\id}4] *R̫e{JwAĜwtQ*jBIg*t322[Ɇ,@":={KBJ#ME 2k2` YH9J\VeT% C@r Y5B8e;ϕ6Bnk7ωo?ϬmvgJ/!Oz{У7^dZ(Hy53 DyT/_)FwZ󗇗d>B)U h,`OUC'vn\Ne>E`{UTߦ>'"ON|_WpB/wzz3rZUB&Q LU$j>8r2#K~֎[WX[74d&~|~oEdLکE)Q"GD~QWil:zWH(7@~gcvJg,\6rî;%[y+OZh6.N)*Rf:o?2ay]&?P^\ ]L)c@ȡEv|05`T0FUtVU4&gkNsϮo6W= }ۓ(F,938[79hWEy5zXKC =dB#p*4&fPUU*DD%9gdr諘@i ˡ6¯ !z96_kMg~]2SFJZa\7NdZ5CS&H`seϞBm9^;eRB,T2)m6(-X6%BJ0ڙlBH)5"CzU%նc{Xh4YKi0mp6}Ӻ}`k.a%gAM6Eoٲk0 HD eӓz0F؋ZVO\ܙwn;ds)[]H vA;+ VLPYܸܶw Gӭ_6}O!Npb&t" oˍ1~S%q2\+ v> /*G"W%.8 2~Jiq j%e5]'W&Z[%I6U^t-^Q=Ws̽% M~# uǟLm]nG`>ˍXGJ\ZRlr**繈7d޹J˂ccSBϧlq3 -qNG1[n׍ÎZ.P[u17 [0U jE] 35ڪ:jv4\"E*h6ېP>M4ZF-ep\u5VC(1YI*ZQc'֪ | ʻsv)hCbǬNsHj/6Sr{e%iC9=`F#N +z;&/A} j 869&ы ?֒LV˴o!24J-9yJ1͊*UʬLJlSTc~J^r/uuo*Uחŕ*˧'`T҃ڒzJIŝ;C2#WTK%+lBqиN Z,z1ˤ hR LJPwmRzRLJ5!S5KKi.~V`]^k-*qۄw 'Ij%^[Z-lz^mh16FWЗsgy-IL 1N(cj}]T.cQ 1bt?_owlˋ:dc;3:*%RZLE 4oE}!hrzG$Xi_?5aBE5":O2QtJfR!P_+RoM}77xOwt{+MXѹP /u杀H1Zx(e4xte'*˛8du=Pi>|"3(*Nw(֌m;t;YvO^Mr#O;'oPd4zl-BdDGg jh 6eJ*kiiٰ湅$NpjO7#g*VUYP"* 4hgʇb"9#S^&U uJeTjHv@8}afm-߮2/"cUHf* oVϖ II梧 
F+Uz)+a}U@4HTB~Cƶ٧[+7NWGswucF±/Qx?lC}Rx R PfW+"үFu2ՐA Y[YLtlLʝ]{(}ыWwG߅SvgHZyVS@h`Α?7Aʎl/W! )ju~*M@bɲT³+BCi4ʹqP$G6]:+WM5%#=y܁͒ʒa1>(6\QCBB,a^\ X˘]Gk4\󷚕Vu=Px_o/~zɓ3<èX6iT)A2ç#L$|.Ne=7*^H%*LlnJiep>u{ -;s@zb TZ!H|iFF7ޡؼˑƏoyTdqsϲ񪜌#W,W'te(*^P4 mIb TٻrW/ItdIx;)Hxx艅&yjq{,'`v6{+S6\-j.\{#GV#lǰq[6C7?'4c1 mGdu$DR2d.@ߞm`K/fABdz /*X}c7ƃ/_FIƭ?;@009 Sc.NJ LӅv6rhؚ͇Ux|io1x8YR\xiS_$='<ƴi&Ƶ*1NLF;ȹn^YiT!}SO _@?u |{ӡ>̨^L|$`/^0^jrRpS$e*Me3"C#ԇ+u {^4ZZHf'T:}}h)֛(Mz S 8r0BlF~SKn͈7yhXXwrqSEW'p2s7u%R)8~6~փiB8y3EG3.׷h}i WY`:;]>2nڌyqoE21mcݯ͛ DĀXW8#VJpoA cy.gf=4DE+6И>j?w:헂>=A1gq1|K"u+S_ٓM[jS+=f@ؓ8· b>YlBםcm` 6|Ls&1v?gOZ=ŗ=ft3kgz+؃J*9.)baAoO+ޕ˷?re3@ AFR:q8ٵ{+ I`q6lNdvw[jdڐ'l:9fr֞,'{ Ӂ}P-zh5o^qX =YdaJU.{,&vSV{ `ǐ0 BB5T6ABM|0eB;m]`U{p黁*E%BT. \jRǾcwKi39+,W=a 龜!|o0N1ms%%?1;w,) 7:^]v!f>Lh,`ZM `Z6)2n:Axf5k<|i.{sbI3xy&$(fONv#YNt,c!RM,Ln(gM7FV !I4ط\5Ze~5 !hd$:er앚/:?0E܂hZ)+X{a$%yta8™E3ꔤD?ןg> տj[ >$0L~|\j8.ݬ{ssɥl#JCIq.'AL<є@E%)F} >7R]&9@NȄ1VW;U#`ϔS~埒 Hp{wu8A`Q្ ZPVWSwY$$i[d(uyy>Av8R,`EC*4Wxo!1n‡lKчȪ)}aCX?,dR< hCRj#$4z7}'-5<7C͇P0*r0&]b Taz`|JB`* NtAs*zF§32'Nނ_ED:,"e1ĚsÞ_ԯdp@1qL6T=F?osc&\[59½߼pd9~n$xwDo럟؇?a_ 57)PUQxCO;eSMs&Tx-˫O؜w 3E*joZ}5;MR BON]T 2g!@@XonW}]|3+Tp4}w?A,m3|/wO1۔Kt(bNv5f)ac)W܅7CFb39bpQE{jmޝCX |YPEݓ`! .ЊࢮY;㱣!{<=$^WMr_2Ë@J,c A5R$ǧ.~AڔXa:ȷ*cz@rq3bϞ#gH|[giPUv?rPJI afz~IKofB 0p83 '(b,z4|^"}K׉fAxK}r`pce&ST1z="W/ҒnVNMQ sg^'un3 ߀qI4閭ӂAOETM%3ʾo#xL|ۀa-af_bŀԑu7&;:k9Dž~_Wp7"y0jۙ}@$2GuFWvMڟACjMf-r׿,o럢P6gpyuE61 ez9_!)*CX-0hiz M2>"%xHYbtpwS*fe|GfdDT^ϔҎ^ VX2nQc@O"1:͉"1fe{c2-Ę;\2=ߚ+(Bu`Sq'dfEx"gf1 0teTKf>~)rw(()bȘZIof.|.2̶p/q1v޷~i k3ԛ؟v,aU^^vWdBEM  rQ.9/?{a}>/J4-ȩ|RE/[kl}r>w lt nNbhS;%cKa{IQUxc yd;/OyIUt-1y_2cpƢџqYsCJPjM>I.Y{x֗.ѽ? 
R17b=,Us0N Q 9Y+^bSb jRCMr_;'A~m0a)M AaS(#SBN),k8$2@芨p"A #w/0s$E+V z "YA(qF=@hl摧pc@ (DE.-kڄh @5.EiXy>ksصA b;Pp+ I0Rsn5y#H VSF pivTvv)^0AdD,R찦Zdܥ d [ Qd |lODV3ޭj#4ظj`.V’|9 <$i[A0nvޟ}Ғo2YƕU2l@s-,vgp)'xB~FUgeFMXo ź@q2}A EFq6$ XONs>L]a6Yyu+7u8M8|P2:ci|Y`<2NӵQxC!*({,ƿO&930QqFI{΄g+~77/KDJ5MpW-G|y 7.F۲Zqմh {+]54|sM4 Q=CP} i< ~da$j~li Ԅ>}(\<( [zjRR**ͺvzY+*Tӧ0ht_tJlT.Tʙtu]ϚDFutr[mߞ*ݜO6c2 xMo'D% g;YTƔn7UOOU] |r摠8Y#fT)-z;KP²-A%#Ǎ//|@}g{?/ryv֐p%S:]ӛMRVѩ*xW$iW![EL)D;qnh7Np'[] RD;hcFpޚv_"Q5!!/\Dd*e}ʸYN w_\m,QɂLHV au[mL(yt演Q)§-2lcSh<Li.Ps}i4h*Ր6_'ygk\5;r't7_m&;t,ϔ,SJOہh;>} AY$yU\k FAPEypyk(^D9C"Gj~(Xp<-{{s 謷`Gk_R͡򽥙J8܂{&^Lg\Ϊ4}䠘v$ӔǢAPV8BV 6e+3_R?<L2'Y, $c9uY.h}# k&Yk7T`.ވfMuo2P;:3{>c32nt%~)#޽? Pt=EKja ̌SV(18#Ԓ3D`a` RnUjN^nV,G;ER뀫nTskmjȘDbSE:Bd;/&$֦Xɏ vJ\p^}-}ѥ-ָn ׸m>8br*U>cwU@?|a y6PCg9ڬ1kK9OxIxƔc W/j(~H]`LBoʖ ^kTRj4bn6EjfI Nf\P C`cRJTZ%R0,!&4rC:PpGob4i2Ye #3D>(a%!:VovZPrgy;ɡȃ原*~{cL9"ϡ !\D_BhɒvcJAuAt6mwgִ[gBk[EL)f~q/[WԈNhvw~*69oZu&V!!\Dז)F3KZHQpu:#3$qDhfĐQpk#;i'^0l9ؓh?g"N'c5[&‹H<.F@JY]e $xUxu'Uռ*< À *aOp]j]  8h2D>D-0v`@HQz匈7e3/ 1P)lP_Zt<".0_LoȹIfG4dҼ{e L$E5q )@(302$b,{!+!CV%#c$b@%3YW4S>g,{?.TLQ!L$;gDho&JAzGr= *E /Ԫ#˺]%&w ~Ư~(SG3["CNs1L]UIP,-tVRZH֕Z` 仑en2Ȳ(={A7[dD%!7!/U:YD#ϾMD#OJĕHǬH8(|2a< "ZHB}.a9?3s&[aPȹz+*ězF0Ivp W# мrF<c[V `_itZ$C+"D> I4y,Wd՛R/Qߠ/l.*oN#<J8n 'd/rwr[+nR [x7e)TkLqA7C ͏p:$RBTC-BQlQO,W4[F'!MfD7}*UM[ 1V\'5ZX45trO]8*ͷ6}8߳/"N&G, ?q<^]O?wv|:^}89a4>43;:O /}78I) De-hiVٿǷ Xn%˞1+Pf!.eJcR)ɥz4 mW\6iba>*,sETN2s <uXp1PwiӋ{ 4%,Kmf;Tt^Ҫe@Cy\~a 'XῙJ,7s9kdsַ9kE:˥|FD~<ݡ9zIY/ ʋFsY"l?e f龍e:VcOBʡVI`|Rm:koH"P\^n ?ȣjjj@hTZm8 3֘u{%V,a+J"[Tp74p~fۈHNz+)-dK38U$`JN9)8ձ3&dO{CzfJz+Fgs ߨXh%y#o1֟@1 4@@ПmU|݊vhձ M^P<䙶2o'*=-W6^*jh0:\&Ԛz2A6{p_ǶڴBH'rH~L𵙀7r&=7:4-сtfMlk/җѴ~a ϸ[t޺ǗehfyS8g1tU^{l#quח*Lb@eõuen+D 4[Lm6bN`]Sa<)+ݺXC/XIwd 6&ߒRs:zv۹v+uY7T!PF]u3!Np Dh~Y4DJ=F``"O! 3( 4Q$Y4HH= 3pםd:hsOAt~x)z9:Kx; 2+(S%(I$eq¯"/P2LW̿;gi?ODw.#9L)RBUd.+A~I_Frz>|&}Ĭ&O糙;"a$z Yrc! 
192.168.126.11:17697: read: connection reset by peer" start-of-body=
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.446706 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38292->192.168.126.11:17697: read: connection reset by peer"
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.499148 4873 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.499210 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.504361 4873 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.504421 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.585514 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.586931 4873 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="70fd060754ce5d2cd66259d212c3ff5be0347c059a51e1f6b9bda74e1ed507e7" exitCode=255
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.586970 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"70fd060754ce5d2cd66259d212c3ff5be0347c059a51e1f6b9bda74e1ed507e7"}
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.587147 4873 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.588061 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.588088 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.588134 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.588543 4873 scope.go:117] "RemoveContainer" containerID="70fd060754ce5d2cd66259d212c3ff5be0347c059a51e1f6b9bda74e1ed507e7"
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.740645 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.740823 4873 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.743567 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.743597 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.743609 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:05 crc kubenswrapper[4873]: I0219 09:45:05.774817 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Feb 19 09:45:06 crc kubenswrapper[4873]: I0219 09:45:06.424126 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 03:18:44.467177582 +0000 UTC
Feb 19 09:45:06 crc kubenswrapper[4873]: I0219 09:45:06.591508 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 19 09:45:06 crc kubenswrapper[4873]: I0219 09:45:06.593473 4873 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 09:45:06 crc kubenswrapper[4873]: I0219 09:45:06.593425 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90"}
Feb 19 09:45:06 crc kubenswrapper[4873]: I0219 09:45:06.593810 4873 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 09:45:06 crc kubenswrapper[4873]: I0219 09:45:06.594500 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:06 crc kubenswrapper[4873]: I0219 09:45:06.594533 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:06 crc kubenswrapper[4873]: I0219 09:45:06.594546 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:06 crc kubenswrapper[4873]: I0219 09:45:06.595206 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:06 crc kubenswrapper[4873]: I0219 09:45:06.595252 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:06 crc kubenswrapper[4873]: I0219 09:45:06.595266 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:06 crc kubenswrapper[4873]: I0219
09:45:06.608914 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 19 09:45:06 crc kubenswrapper[4873]: I0219 09:45:06.666160 4873 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]log ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]etcd ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/openshift.io-api-request-count-filter ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/openshift.io-startkubeinformers ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/generic-apiserver-start-informers ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/priority-and-fairness-filter ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/start-apiextensions-informers ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/start-apiextensions-controllers ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/crd-informer-synced ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/start-system-namespaces-controller ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 19 09:45:06 crc kubenswrapper[4873]: 
[+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/start-legacy-token-tracking-controller ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/start-service-ip-repair-controllers ok Feb 19 09:45:06 crc kubenswrapper[4873]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/priority-and-fairness-config-producer ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/bootstrap-controller ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/start-kube-aggregator-informers ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/apiservice-registration-controller ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/apiservice-discovery-controller ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]autoregister-completion ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/apiservice-openapi-controller ok Feb 19 09:45:06 crc kubenswrapper[4873]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 19 09:45:06 crc kubenswrapper[4873]: livez check failed Feb 19 09:45:06 crc kubenswrapper[4873]: I0219 09:45:06.666226 4873 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:45:07 crc kubenswrapper[4873]: I0219 09:45:07.425095 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 09:33:10.128661185 +0000 UTC Feb 19 09:45:07 crc kubenswrapper[4873]: I0219 09:45:07.595540 4873 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 09:45:07 crc kubenswrapper[4873]: I0219 09:45:07.596519 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:07 crc kubenswrapper[4873]: I0219 09:45:07.596549 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:07 crc kubenswrapper[4873]: I0219 09:45:07.596561 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:07 crc kubenswrapper[4873]: I0219 09:45:07.877231 4873 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 09:45:07 crc kubenswrapper[4873]: I0219 09:45:07.877294 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 
09:45:08 crc kubenswrapper[4873]: I0219 09:45:08.006035 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 09:45:08 crc kubenswrapper[4873]: I0219 09:45:08.006303 4873 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 09:45:08 crc kubenswrapper[4873]: I0219 09:45:08.008810 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:08 crc kubenswrapper[4873]: I0219 09:45:08.008863 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:08 crc kubenswrapper[4873]: I0219 09:45:08.008881 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:08 crc kubenswrapper[4873]: I0219 09:45:08.037371 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 09:45:08 crc kubenswrapper[4873]: I0219 09:45:08.037605 4873 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 09:45:08 crc kubenswrapper[4873]: I0219 09:45:08.039264 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:08 crc kubenswrapper[4873]: I0219 09:45:08.039320 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:08 crc kubenswrapper[4873]: I0219 09:45:08.039344 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:08 crc kubenswrapper[4873]: I0219 09:45:08.425901 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 06:18:33.543596538 +0000 UTC Feb 19 
09:45:09 crc kubenswrapper[4873]: I0219 09:45:09.426581 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 14:38:07.226503819 +0000 UTC Feb 19 09:45:10 crc kubenswrapper[4873]: I0219 09:45:10.427435 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 18:08:06.647487542 +0000 UTC Feb 19 09:45:10 crc kubenswrapper[4873]: E0219 09:45:10.481555 4873 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 19 09:45:10 crc kubenswrapper[4873]: I0219 09:45:10.485294 4873 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 09:45:10 crc kubenswrapper[4873]: I0219 09:45:10.485369 4873 trace.go:236] Trace[1568776502]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 09:44:55.698) (total time: 14787ms): Feb 19 09:45:10 crc kubenswrapper[4873]: Trace[1568776502]: ---"Objects listed" error: 14787ms (09:45:10.485) Feb 19 09:45:10 crc kubenswrapper[4873]: Trace[1568776502]: [14.787065808s] [14.787065808s] END Feb 19 09:45:10 crc kubenswrapper[4873]: I0219 09:45:10.485412 4873 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 09:45:10 crc kubenswrapper[4873]: E0219 09:45:10.486173 4873 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 19 09:45:10 crc kubenswrapper[4873]: I0219 09:45:10.487458 4873 trace.go:236] Trace[886226136]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 
09:44:58.170) (total time: 12316ms): Feb 19 09:45:10 crc kubenswrapper[4873]: Trace[886226136]: ---"Objects listed" error: 12316ms (09:45:10.487) Feb 19 09:45:10 crc kubenswrapper[4873]: Trace[886226136]: [12.316509046s] [12.316509046s] END Feb 19 09:45:10 crc kubenswrapper[4873]: I0219 09:45:10.487538 4873 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 19 09:45:10 crc kubenswrapper[4873]: I0219 09:45:10.488015 4873 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 19 09:45:10 crc kubenswrapper[4873]: I0219 09:45:10.489532 4873 trace.go:236] Trace[91785162]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 09:44:56.063) (total time: 14425ms): Feb 19 09:45:10 crc kubenswrapper[4873]: Trace[91785162]: ---"Objects listed" error: 14425ms (09:45:10.489) Feb 19 09:45:10 crc kubenswrapper[4873]: Trace[91785162]: [14.42598717s] [14.42598717s] END Feb 19 09:45:10 crc kubenswrapper[4873]: I0219 09:45:10.489578 4873 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 09:45:10 crc kubenswrapper[4873]: I0219 09:45:10.518552 4873 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.362332 4873 apiserver.go:52] "Watching apiserver" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.368909 4873 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.369623 4873 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.370588 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.370906 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.370918 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.372002 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.371378 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.371843 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.371877 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.373245 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.371284 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.384088 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.384248 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.384504 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.384874 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.385009 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.385302 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.385584 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.385765 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.390431 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.411066 4873 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.428203 4873 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 19:26:41.748402162 +0000 UTC Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.433928 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.450859 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.468063 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.482944 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.494652 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.495008 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.495254 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.495454 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.495679 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.495946 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.496247 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.496473 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.496693 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.496882 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.497031 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.497289 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.497503 4873 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.497683 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.497865 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.498020 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.498268 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.498456 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.495480 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.495506 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.495682 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.495887 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.496145 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.496712 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.496888 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.497039 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.497232 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.497542 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.497700 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.497862 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.498308 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.498466 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.498625 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.498949 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.499507 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.499594 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.499993 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.498647 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.500528 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.500675 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.500851 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.500984 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501009 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501131 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501189 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501331 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501342 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501377 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501421 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501456 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501498 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501586 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501624 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501669 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501705 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501742 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501783 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501826 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 
09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501868 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501870 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501909 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501947 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.501981 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502018 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502051 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502084 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502162 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502216 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502260 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502293 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502325 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502359 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502394 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502429 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502461 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502533 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502569 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502602 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502635 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502670 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502708 
4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502741 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502774 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502810 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502847 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502882 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502916 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502953 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502988 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503021 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503055 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503136 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503174 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503208 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503240 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503281 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503317 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503480 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503521 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503554 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503589 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503623 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502163 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503683 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503721 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503757 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503791 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503825 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503860 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503896 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503932 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503966 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504005 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504078 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504152 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504202 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504260 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504299 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504332 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504364 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504402 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504438 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504473 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504507 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504591 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504624 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504658 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504693 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504735 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504769 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504807 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504841 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504875 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504909 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504941 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504974 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505010 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505044 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505079 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505170 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505206 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505239 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505274 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505313 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505355 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505392 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505431 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505481 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505517 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505552 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505635 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505671 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505712 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505748 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505784 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505820 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505857 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505893 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505930 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505967 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506006 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506041 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506075 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506143 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506200 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506256 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506292 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506327 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506364 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506398 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506438 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506474 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506508 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506545 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506585 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506626 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506660 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506696 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506732 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506770 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506807 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506843 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506878 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506914 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506960 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506995 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.507030 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.507068 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.507133 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.507171 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.507211 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.507250 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.507287 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.507325 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.507920 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.507958 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219
09:45:11.507999 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508060 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508100 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508162 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508202 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508240 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508276 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508314 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508354 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508392 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508428 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 09:45:11 crc 
kubenswrapper[4873]: I0219 09:45:11.508464 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508501 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508541 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508578 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508617 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508654 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508689 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508724 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508760 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508801 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508838 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508877 4873 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508946 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508996 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509046 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509086 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509289 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509349 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509388 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509432 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509476 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509516 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509557 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509597 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509636 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509678 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509786 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509812 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509836 4873 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509859 4873 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509891 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509914 4873 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc 
kubenswrapper[4873]: I0219 09:45:11.509937 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509960 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509982 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510004 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510029 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510051 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510074 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: 
I0219 09:45:11.510095 4873 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510157 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510186 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510210 4873 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510231 4873 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510256 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510278 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510298 
4873 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510321 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510344 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510365 4873 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510537 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.512423 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502169 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: 
"trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502510 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502522 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502721 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502783 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.502880 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503077 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503212 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503239 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503427 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503600 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.524766 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503663 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503826 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.503977 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504216 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504391 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504416 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504715 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.504724 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505001 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505293 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.505736 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506118 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506403 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506678 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.506739 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.507288 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.507380 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.525517 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.508272 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509029 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509577 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509753 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509850 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.509878 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510121 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510212 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510310 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510307 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510420 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510486 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.510831 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.511012 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.511122 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.511556 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.511596 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.511664 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.525975 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.516092 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.516360 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.517414 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.517944 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.518026 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.518074 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.518332 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.518350 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.519185 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.519322 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.519670 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.519772 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.519869 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:45:12.019829647 +0000 UTC m=+21.309261325 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.519943 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.520255 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.520439 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.520419 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.520546 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.520766 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.520799 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.520826 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.521012 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.521122 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.521428 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.521534 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.521588 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.521824 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.522130 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.522213 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.522525 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.522563 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.522599 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.522895 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.522945 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.523029 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.523448 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.523978 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.523976 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.524209 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.524567 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.524842 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.525286 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.507689 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.525773 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.525774 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.526657 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.528616 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.528948 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.529075 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.529732 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.530280 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.530300 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.530572 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.530978 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.531056 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.531252 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.531678 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.532274 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.532333 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.533899 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.534266 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.534634 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.534837 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.534919 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.535196 4873 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.535202 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.535197 4873 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.535296 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.535415 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.535692 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.535754 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.535909 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.536624 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.536650 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.536724 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.536396 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.537067 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.537086 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.537284 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.536793 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.537482 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:12.037441824 +0000 UTC m=+21.326873502 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.537486 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.537520 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:12.037502355 +0000 UTC m=+21.326934033 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.537665 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.537672 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.537722 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.537717 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.538085 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.538180 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.538666 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.538735 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.538726 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.538783 4873 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.539079 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.539403 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.544824 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.546576 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.546599 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.546664 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.547597 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.549248 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.549898 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.549961 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.550041 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.550906 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.551000 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.551383 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.554765 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.554878 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.555898 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.556215 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.556768 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.559361 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.559824 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.559937 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.560186 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.560696 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.563062 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.564389 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.564911 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.565187 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.565963 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.570187 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.575161 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.575195 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.575207 4873 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.575279 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:12.075260903 +0000 UTC m=+21.364692541 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.575586 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.575613 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.575622 4873 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.575661 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:12.075648662 +0000 UTC m=+21.365080300 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.579530 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.580776 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.584187 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.584207 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.584562 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.585706 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.588966 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.589416 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.589702 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.590293 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.601923 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.607447 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.607519 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.608428 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.608860 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.610830 4873 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90" exitCode=255 Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.610895 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90"} Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.610979 4873 scope.go:117] "RemoveContainer" containerID="70fd060754ce5d2cd66259d212c3ff5be0347c059a51e1f6b9bda74e1ed507e7" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.611585 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.611895 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.612423 4873 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.613171 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.613371 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.613369 4873 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.613537 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.613592 4873 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.613668 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.614782 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621446 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.613700 4873 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621571 4873 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621585 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621597 4873 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621609 4873 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621618 4873 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621628 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621640 4873 reconciler_common.go:293] "Volume detached 
for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621653 4873 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621685 4873 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621700 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621715 4873 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621727 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621735 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621745 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" 
(UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621755 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621764 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621773 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621783 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621792 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621801 4873 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621811 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621820 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621831 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621843 4873 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621854 4873 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621867 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621879 4873 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621891 4873 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621904 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621913 4873 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621923 4873 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621934 4873 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621944 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621953 4873 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621963 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621972 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621981 4873 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621990 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.621999 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622007 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622018 4873 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622027 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 
09:45:11.622035 4873 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622045 4873 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622054 4873 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622063 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622072 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622081 4873 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622090 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622115 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622126 4873 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622138 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622201 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622214 4873 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622226 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622262 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622278 4873 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" 
DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622290 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622329 4873 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622342 4873 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622352 4873 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622360 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622369 4873 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622380 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622389 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622573 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622587 4873 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622597 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622608 4873 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622617 4873 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622630 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622648 4873 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622658 4873 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622667 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622676 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622685 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622694 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622703 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622712 4873 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622721 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622742 4873 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622751 4873 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622760 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622770 4873 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622781 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622790 4873 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622799 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.622814 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623000 4873 scope.go:117] "RemoveContainer" containerID="7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623138 4873 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623171 4873 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623185 4873 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623198 4873 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623213 4873 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623225 4873 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623234 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623231 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623244 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623314 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623327 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623339 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node 
\"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623350 4873 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623360 4873 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623370 4873 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623380 4873 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623399 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623411 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623422 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623433 4873 
reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623441 4873 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623451 4873 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623459 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623469 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623478 4873 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623487 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623496 4873 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623506 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623515 4873 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623524 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623532 4873 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623541 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623550 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623559 4873 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 
crc kubenswrapper[4873]: I0219 09:45:11.623568 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623577 4873 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623586 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623595 4873 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: E0219 09:45:11.623413 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623627 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623639 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623714 4873 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623725 4873 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623735 4873 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623745 4873 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623754 4873 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623763 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623773 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node 
\"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623783 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623792 4873 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623802 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623812 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623821 4873 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623831 4873 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623842 4873 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623854 4873 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623864 4873 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623877 4873 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623887 4873 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623898 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623908 4873 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623918 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623929 4873 reconciler_common.go:293] "Volume detached for volume 
\"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623938 4873 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623947 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623956 4873 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623965 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623975 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623987 4873 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.623996 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" 
DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.624006 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.624015 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.624025 4873 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.624037 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.624046 4873 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.624055 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.624065 4873 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.624141 4873 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.633428 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.642469 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.651573 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.661725 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.664092 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.671544 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.680728 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.691882 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.703861 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.706047 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.718937 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fd060754ce5d2cd66259d212c3ff5be0347c059a51e1f6b9bda74e1ed507e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:05Z\\\",\\\"message\\\":\\\"W0219 09:44:54.615928 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 09:44:54.616259 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771494294 cert, and key in /tmp/serving-cert-40864076/serving-signer.crt, /tmp/serving-cert-40864076/serving-signer.key\\\\nI0219 09:44:54.929626 1 observer_polling.go:159] Starting file observer\\\\nW0219 09:44:54.937377 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 09:44:54.937554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:44:54.940417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-40864076/tls.crt::/tmp/serving-cert-40864076/tls.key\\\\\\\"\\\\nF0219 09:45:05.433822 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.721913 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.725477 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.725519 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.730902 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.736031 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fd060754ce5d2cd66259d212c3ff5be0347c059a51e1f6b9bda74e1ed507e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:05Z\\\",\\\"message\\\":\\\"W0219 09:44:54.615928 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 09:44:54.616259 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771494294 cert, and key in /tmp/serving-cert-40864076/serving-signer.crt, /tmp/serving-cert-40864076/serving-signer.key\\\\nI0219 09:44:54.929626 1 observer_polling.go:159] Starting file observer\\\\nW0219 09:44:54.937377 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 09:44:54.937554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:44:54.940417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-40864076/tls.crt::/tmp/serving-cert-40864076/tls.key\\\\\\\"\\\\nF0219 09:45:05.433822 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: W0219 09:45:11.747432 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-a2b37d751f9fecf06f003a04ebbc699170ec384c85835bd79e523ef75818a9ae WatchSource:0}: Error finding container a2b37d751f9fecf06f003a04ebbc699170ec384c85835bd79e523ef75818a9ae: Status 404 returned error can't find the container with id a2b37d751f9fecf06f003a04ebbc699170ec384c85835bd79e523ef75818a9ae Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.752565 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: W0219 09:45:11.759040 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-585f0209df108bb1488da3b7671de75e7f1820ba89e465e9ba6b44be7a60fd58 WatchSource:0}: Error finding container 585f0209df108bb1488da3b7671de75e7f1820ba89e465e9ba6b44be7a60fd58: Status 404 returned error can't find the container with id 585f0209df108bb1488da3b7671de75e7f1820ba89e465e9ba6b44be7a60fd58 Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.764643 4873 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.777351 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.798013 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.812004 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:11 crc kubenswrapper[4873]: I0219 09:45:11.825009 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.030263 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.030438 4873 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:45:13.030410462 +0000 UTC m=+22.319842110 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.131144 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.131199 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.131234 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.131261 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.131391 4873 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.131428 4873 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.131464 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.131497 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.131514 4873 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.131479 4873 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:13.13145459 +0000 UTC m=+22.420886248 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.131594 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:13.131568583 +0000 UTC m=+22.421000241 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.131612 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:13.131603574 +0000 UTC m=+22.421035342 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.131869 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.131944 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.131963 4873 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.132054 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:13.132035865 +0000 UTC m=+22.421467533 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.429685 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 13:23:20.002372647 +0000 UTC Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.483739 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.483885 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.614775 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8"} Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.614826 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7af00045211f8e3e88ed8a53b838d190c4150f96840141a108e844fe99fc0c39"} Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.617203 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.620093 4873 scope.go:117] "RemoveContainer" containerID="7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90" Feb 19 09:45:12 crc kubenswrapper[4873]: E0219 09:45:12.620280 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.622330 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162"} Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.622375 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675"} Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.622390 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"585f0209df108bb1488da3b7671de75e7f1820ba89e465e9ba6b44be7a60fd58"} Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.623686 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a2b37d751f9fecf06f003a04ebbc699170ec384c85835bd79e523ef75818a9ae"} Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.624384 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.630542 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fd060754ce5d2cd66259d212c3ff5be0347c059a51e1f6b9bda74e1ed507e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:05Z\\\",\\\"message\\\":\\\"W0219 09:44:54.615928 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 09:44:54.616259 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771494294 cert, and key in /tmp/serving-cert-40864076/serving-signer.crt, /tmp/serving-cert-40864076/serving-signer.key\\\\nI0219 09:44:54.929626 1 observer_polling.go:159] Starting file observer\\\\nW0219 09:44:54.937377 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 09:44:54.937554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:44:54.940417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-40864076/tls.crt::/tmp/serving-cert-40864076/tls.key\\\\\\\"\\\\nF0219 09:45:05.433822 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 
09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.645769 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.656535 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.668889 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.687306 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:12Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.707251 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:12Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.727711 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:12Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.744143 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:12Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.764504 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:12Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.779920 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:12Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.802833 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:12Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.820900 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:12Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.840814 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:12Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:12 crc kubenswrapper[4873]: I0219 09:45:12.862560 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:12Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.040362 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.040629 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:45:15.04059287 +0000 UTC m=+24.330024538 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.141864 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.141939 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.141984 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.142072 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.142080 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.142245 4873 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.142276 4873 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.142323 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:15.142300235 +0000 UTC m=+24.431731883 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.142357 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-19 09:45:15.142334666 +0000 UTC m=+24.431766344 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.142433 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.142478 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.142503 4873 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.142718 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:15.142594812 +0000 UTC m=+24.432026500 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.142750 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.142783 4873 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.142838 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:15.142823568 +0000 UTC m=+24.432255276 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.430132 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 20:28:00.367320706 +0000 UTC Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.483831 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.483872 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.484002 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.484139 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.488618 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.489559 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.490824 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.491776 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.492634 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.493384 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.494205 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.495005 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.497660 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.499015 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.500302 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.501847 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.502925 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.504067 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.505367 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.506610 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.508906 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.509346 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.509886 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.510460 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.510929 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.511490 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.512813 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.513462 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.514263 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.515818 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.517631 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.518862 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.520265 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.521463 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.522702 4873 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.522918 4873 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.525665 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.526398 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.526809 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.527894 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.528528 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.529010 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.529675 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.530335 4873 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.530827 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.531421 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.532056 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.532672 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.533797 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.534481 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.535584 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.536425 4873 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.537272 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.537704 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.538203 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.539074 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.539632 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.540496 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 19 09:45:13 crc kubenswrapper[4873]: I0219 09:45:13.629720 4873 scope.go:117] "RemoveContainer" containerID="7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90" Feb 19 09:45:13 crc kubenswrapper[4873]: E0219 09:45:13.629869 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 19 09:45:14 crc kubenswrapper[4873]: I0219 09:45:14.322059 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 09:45:14 crc kubenswrapper[4873]: I0219 09:45:14.430301 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 04:35:12.029043828 +0000 UTC Feb 19 09:45:14 crc kubenswrapper[4873]: I0219 09:45:14.483250 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:14 crc kubenswrapper[4873]: E0219 09:45:14.483434 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:14 crc kubenswrapper[4873]: I0219 09:45:14.633792 4873 scope.go:117] "RemoveContainer" containerID="7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90" Feb 19 09:45:14 crc kubenswrapper[4873]: E0219 09:45:14.634061 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 19 09:45:14 crc kubenswrapper[4873]: I0219 09:45:14.884084 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 09:45:14 crc kubenswrapper[4873]: I0219 09:45:14.891497 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 09:45:14 crc kubenswrapper[4873]: I0219 09:45:14.897486 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 19 09:45:14 crc kubenswrapper[4873]: I0219 09:45:14.911192 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:14Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:14 crc kubenswrapper[4873]: I0219 09:45:14.937840 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0da
d8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:14Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:14 crc kubenswrapper[4873]: I0219 09:45:14.963574 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:14Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:14 crc kubenswrapper[4873]: I0219 09:45:14.981986 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:14Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.004488 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.027910 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.047734 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.055527 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.055814 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:45:19.055783996 +0000 UTC m=+28.345215674 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.068530 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.091343 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.111471 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.156897 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.156989 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.157038 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.157151 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.157282 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.157321 4873 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.157341 4873 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.157376 4873 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.157416 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:19.157390119 +0000 UTC m=+28.446821797 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.157443 4873 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.157468 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:19.15743789 +0000 UTC m=+28.446869538 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.157305 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.157634 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:19.157601084 +0000 UTC m=+28.447032732 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.157679 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.157712 4873 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.157830 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:19.157777898 +0000 UTC m=+28.447209676 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.158616 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.177209 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.198870 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.223003 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0da
d8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.243517 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.430593 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 06:07:21.002602878 +0000 UTC Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.483734 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.483835 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.483918 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.484052 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.636965 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab"} Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.638317 4873 scope.go:117] "RemoveContainer" containerID="7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90" Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.638623 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 19 09:45:15 crc kubenswrapper[4873]: E0219 09:45:15.645720 4873 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.658498 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\"
:\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.682574 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.704997 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.726049 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.743181 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.758458 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.774784 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:15 crc kubenswrapper[4873]: I0219 09:45:15.793285 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc1
5eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:15Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.431340 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 20:12:56.967949963 +0000 UTC Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.483405 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:16 crc kubenswrapper[4873]: E0219 09:45:16.483581 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.886661 4873 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.888783 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.888840 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.888851 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.888924 4873 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.894114 4873 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.894474 4873 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.895549 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.895603 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.895619 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.895641 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.895657 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:16Z","lastTransitionTime":"2026-02-19T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:16 crc kubenswrapper[4873]: E0219 09:45:16.913449 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:16Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.917093 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.917173 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.917189 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.917211 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.917229 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:16Z","lastTransitionTime":"2026-02-19T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:16 crc kubenswrapper[4873]: E0219 09:45:16.930976 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:16Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.935620 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.935668 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.935683 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.935704 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.935719 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:16Z","lastTransitionTime":"2026-02-19T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:16 crc kubenswrapper[4873]: E0219 09:45:16.947752 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:16Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.951691 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.951732 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.951748 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.951769 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.951790 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:16Z","lastTransitionTime":"2026-02-19T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:16 crc kubenswrapper[4873]: E0219 09:45:16.965929 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:16Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.969895 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.970062 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.970201 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.970351 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:16 crc kubenswrapper[4873]: I0219 09:45:16.970473 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:16Z","lastTransitionTime":"2026-02-19T09:45:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:16 crc kubenswrapper[4873]: E0219 09:45:16.998371 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:16Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:16 crc kubenswrapper[4873]: E0219 09:45:16.998764 4873 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.000487 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.000545 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.000563 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.000592 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.000610 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:17Z","lastTransitionTime":"2026-02-19T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.091671 4873 csr.go:261] certificate signing request csr-ldd8c is approved, waiting to be issued Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.103515 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.103559 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.103572 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.103592 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.103610 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:17Z","lastTransitionTime":"2026-02-19T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.112274 4873 csr.go:257] certificate signing request csr-ldd8c is issued Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.205408 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.205442 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.205451 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.205465 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.205474 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:17Z","lastTransitionTime":"2026-02-19T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.307743 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.307791 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.307804 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.307824 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.307837 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:17Z","lastTransitionTime":"2026-02-19T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.410395 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.410440 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.410451 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.410467 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.410480 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:17Z","lastTransitionTime":"2026-02-19T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.431552 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 14:09:57.267865004 +0000 UTC Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.483209 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.483264 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:17 crc kubenswrapper[4873]: E0219 09:45:17.483358 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:17 crc kubenswrapper[4873]: E0219 09:45:17.483442 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.512615 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.512660 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.512672 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.512691 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.512704 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:17Z","lastTransitionTime":"2026-02-19T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.554083 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-4pk8x"] Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.554420 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.555408 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-pp77w"] Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.555721 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pp77w" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.556707 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.556705 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.556831 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.556851 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.557231 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.557928 4873 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.557980 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.558404 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.568850 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-control
ler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580683 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-system-cni-dir\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc 
kubenswrapper[4873]: I0219 09:45:17.580715 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-multus-cni-dir\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580730 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-hostroot\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580757 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-run-k8s-cni-cncf-io\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580771 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-multus-conf-dir\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580784 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-multus-daemon-config\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580798 4873 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnjnw\" (UniqueName: \"kubernetes.io/projected/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-kube-api-access-vnjnw\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580812 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmt6g\" (UniqueName: \"kubernetes.io/projected/d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea-kube-api-access-dmt6g\") pod \"node-resolver-pp77w\" (UID: \"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\") " pod="openshift-dns/node-resolver-pp77w" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580825 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-multus-socket-dir-parent\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580841 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-run-netns\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580855 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-cnibin\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580869 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-var-lib-kubelet\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580884 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-etc-kubernetes\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580903 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-cni-binary-copy\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580917 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-var-lib-cni-bin\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580930 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-var-lib-cni-multus\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580943 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-run-multus-certs\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580971 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea-hosts-file\") pod \"node-resolver-pp77w\" (UID: \"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\") " pod="openshift-dns/node-resolver-pp77w" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.580992 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-os-release\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.590007 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.609752 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.614417 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.614584 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.614656 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.614726 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.614802 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:17Z","lastTransitionTime":"2026-02-19T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.621274 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.632067 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.647737 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.658969 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.670142 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.681879 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-cnibin\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682072 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-cnibin\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682081 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-var-lib-kubelet\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682181 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-etc-kubernetes\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682225 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-var-lib-cni-multus\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682259 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-run-multus-certs\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682279 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-etc-kubernetes\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682358 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-var-lib-cni-multus\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682296 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-cni-binary-copy\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682414 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-var-lib-cni-bin\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682426 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-run-multus-certs\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682456 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea-hosts-file\") pod \"node-resolver-pp77w\" (UID: \"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\") " pod="openshift-dns/node-resolver-pp77w" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682482 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-os-release\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682518 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-system-cni-dir\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682520 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-var-lib-cni-bin\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682531 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea-hosts-file\") pod \"node-resolver-pp77w\" (UID: \"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\") " 
pod="openshift-dns/node-resolver-pp77w" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682539 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-multus-cni-dir\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682595 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-hostroot\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682597 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-os-release\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682626 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-run-k8s-cni-cncf-io\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682648 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-multus-conf-dir\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682657 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-hostroot\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682670 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-multus-daemon-config\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682676 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-system-cni-dir\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682702 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-multus-cni-dir\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682709 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-run-k8s-cni-cncf-io\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682714 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnjnw\" (UniqueName: \"kubernetes.io/projected/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-kube-api-access-vnjnw\") pod \"multus-4pk8x\" (UID: 
\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682757 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmt6g\" (UniqueName: \"kubernetes.io/projected/d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea-kube-api-access-dmt6g\") pod \"node-resolver-pp77w\" (UID: \"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\") " pod="openshift-dns/node-resolver-pp77w" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682693 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-multus-conf-dir\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682841 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-run-netns\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682868 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-multus-socket-dir-parent\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682906 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-run-netns\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.682936 4873 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-multus-socket-dir-parent\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.683044 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-cni-binary-copy\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.683412 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-host-var-lib-kubelet\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.683470 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-multus-daemon-config\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.687343 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.698297 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.713459 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnjnw\" (UniqueName: \"kubernetes.io/projected/e1ae3d8d-27cf-489f-a6ba-ef914db74bff-kube-api-access-vnjnw\") pod \"multus-4pk8x\" (UID: \"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\") " pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.716758 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmt6g\" (UniqueName: \"kubernetes.io/projected/d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea-kube-api-access-dmt6g\") pod \"node-resolver-pp77w\" (UID: \"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\") " pod="openshift-dns/node-resolver-pp77w" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.722091 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.722287 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.722377 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.722446 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.722500 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:17Z","lastTransitionTime":"2026-02-19T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.729818 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.743573 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.757651 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.775983 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.791514 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.813611 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.825877 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.825920 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.825930 4873 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.825947 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.825957 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:17Z","lastTransitionTime":"2026-02-19T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.836632 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.859435 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.865185 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4pk8x" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.871256 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-pp77w" Feb 19 09:45:17 crc kubenswrapper[4873]: W0219 09:45:17.879898 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1ae3d8d_27cf_489f_a6ba_ef914db74bff.slice/crio-5a187ab1c980a5e2fb5e0b037e5e6f00a42109df3c8de8358ae4d423f9a958e6 WatchSource:0}: Error finding container 5a187ab1c980a5e2fb5e0b037e5e6f00a42109df3c8de8358ae4d423f9a958e6: Status 404 returned error can't find the container with id 5a187ab1c980a5e2fb5e0b037e5e6f00a42109df3c8de8358ae4d423f9a958e6 Feb 19 09:45:17 crc kubenswrapper[4873]: W0219 09:45:17.885160 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8bbad50_17a6_49b3_aa6a_3d8bcf05f5ea.slice/crio-905da86833ed88c53cfd0279655c029c499ad0bee8a36d9732d486921b652bf5 WatchSource:0}: Error finding container 905da86833ed88c53cfd0279655c029c499ad0bee8a36d9732d486921b652bf5: Status 404 returned error can't find the container with id 905da86833ed88c53cfd0279655c029c499ad0bee8a36d9732d486921b652bf5 Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.886483 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.930611 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.930801 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.930886 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.930950 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.931013 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:17Z","lastTransitionTime":"2026-02-19T09:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.976253 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-n2lwn"] Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.976728 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-qmsl7"] Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.976843 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.977179 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.979004 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.979346 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.979460 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.979552 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.979717 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.979909 4873 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.979980 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 09:45:17 crc kubenswrapper[4873]: I0219 09:45:17.991028 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:17Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.003035 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.014212 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.025389 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.033135 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.033171 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.033180 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.033194 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.033203 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:18Z","lastTransitionTime":"2026-02-19T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.038197 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.060059 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.075153 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.086937 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8c61760e-2955-4688-b68b-1ceeda73f356-mcd-auth-proxy-config\") pod \"machine-config-daemon-qmsl7\" (UID: \"8c61760e-2955-4688-b68b-1ceeda73f356\") " pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.086969 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/acb9409d-e5b1-4d32-9200-8dc32d8923d2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.086985 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/acb9409d-e5b1-4d32-9200-8dc32d8923d2-system-cni-dir\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.087008 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/acb9409d-e5b1-4d32-9200-8dc32d8923d2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.087183 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgmwg\" (UniqueName: \"kubernetes.io/projected/8c61760e-2955-4688-b68b-1ceeda73f356-kube-api-access-fgmwg\") pod \"machine-config-daemon-qmsl7\" (UID: \"8c61760e-2955-4688-b68b-1ceeda73f356\") " pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.087235 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr6v4\" (UniqueName: \"kubernetes.io/projected/acb9409d-e5b1-4d32-9200-8dc32d8923d2-kube-api-access-gr6v4\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.087285 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/acb9409d-e5b1-4d32-9200-8dc32d8923d2-os-release\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 
crc kubenswrapper[4873]: I0219 09:45:18.087309 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/acb9409d-e5b1-4d32-9200-8dc32d8923d2-cnibin\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.087366 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/acb9409d-e5b1-4d32-9200-8dc32d8923d2-cni-binary-copy\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.087416 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c61760e-2955-4688-b68b-1ceeda73f356-proxy-tls\") pod \"machine-config-daemon-qmsl7\" (UID: \"8c61760e-2955-4688-b68b-1ceeda73f356\") " pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.087455 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8c61760e-2955-4688-b68b-1ceeda73f356-rootfs\") pod \"machine-config-daemon-qmsl7\" (UID: \"8c61760e-2955-4688-b68b-1ceeda73f356\") " pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.101119 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.115035 4873 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-19 09:40:17 +0000 UTC, rotation deadline is 2026-11-13 18:55:06.537818787 +0000 UTC Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.115121 4873 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6417h9m48.422702213s for next certificate rotation Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.115048 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.125960 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.133683 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.135278 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.135313 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.135323 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.135365 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.135376 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:18Z","lastTransitionTime":"2026-02-19T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.144084 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.153395 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.164570 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.175652 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.186374 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.188659 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8c61760e-2955-4688-b68b-1ceeda73f356-rootfs\") pod \"machine-config-daemon-qmsl7\" (UID: \"8c61760e-2955-4688-b68b-1ceeda73f356\") " pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.188747 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8c61760e-2955-4688-b68b-1ceeda73f356-mcd-auth-proxy-config\") pod \"machine-config-daemon-qmsl7\" (UID: \"8c61760e-2955-4688-b68b-1ceeda73f356\") " pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.188773 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8c61760e-2955-4688-b68b-1ceeda73f356-rootfs\") pod \"machine-config-daemon-qmsl7\" (UID: \"8c61760e-2955-4688-b68b-1ceeda73f356\") " pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 
09:45:18.188831 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/acb9409d-e5b1-4d32-9200-8dc32d8923d2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.188946 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/acb9409d-e5b1-4d32-9200-8dc32d8923d2-system-cni-dir\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.188983 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/acb9409d-e5b1-4d32-9200-8dc32d8923d2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.189019 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/acb9409d-e5b1-4d32-9200-8dc32d8923d2-system-cni-dir\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.189044 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgmwg\" (UniqueName: \"kubernetes.io/projected/8c61760e-2955-4688-b68b-1ceeda73f356-kube-api-access-fgmwg\") pod \"machine-config-daemon-qmsl7\" (UID: \"8c61760e-2955-4688-b68b-1ceeda73f356\") " 
pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.189066 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr6v4\" (UniqueName: \"kubernetes.io/projected/acb9409d-e5b1-4d32-9200-8dc32d8923d2-kube-api-access-gr6v4\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.189120 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/acb9409d-e5b1-4d32-9200-8dc32d8923d2-os-release\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.189144 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/acb9409d-e5b1-4d32-9200-8dc32d8923d2-cnibin\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.189170 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/acb9409d-e5b1-4d32-9200-8dc32d8923d2-cni-binary-copy\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.189184 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c61760e-2955-4688-b68b-1ceeda73f356-proxy-tls\") pod \"machine-config-daemon-qmsl7\" (UID: 
\"8c61760e-2955-4688-b68b-1ceeda73f356\") " pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.189369 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/acb9409d-e5b1-4d32-9200-8dc32d8923d2-cnibin\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.189832 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/acb9409d-e5b1-4d32-9200-8dc32d8923d2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.189396 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/acb9409d-e5b1-4d32-9200-8dc32d8923d2-os-release\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.190311 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/acb9409d-e5b1-4d32-9200-8dc32d8923d2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.190458 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8c61760e-2955-4688-b68b-1ceeda73f356-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-qmsl7\" (UID: \"8c61760e-2955-4688-b68b-1ceeda73f356\") " pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.190410 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/acb9409d-e5b1-4d32-9200-8dc32d8923d2-cni-binary-copy\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.193895 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c61760e-2955-4688-b68b-1ceeda73f356-proxy-tls\") pod \"machine-config-daemon-qmsl7\" (UID: \"8c61760e-2955-4688-b68b-1ceeda73f356\") " pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.201576 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.207713 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgmwg\" (UniqueName: \"kubernetes.io/projected/8c61760e-2955-4688-b68b-1ceeda73f356-kube-api-access-fgmwg\") pod \"machine-config-daemon-qmsl7\" (UID: \"8c61760e-2955-4688-b68b-1ceeda73f356\") " pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.210990 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr6v4\" (UniqueName: \"kubernetes.io/projected/acb9409d-e5b1-4d32-9200-8dc32d8923d2-kube-api-access-gr6v4\") pod \"multus-additional-cni-plugins-n2lwn\" (UID: \"acb9409d-e5b1-4d32-9200-8dc32d8923d2\") " 
pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.216783 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.227474 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.237451 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.237486 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.237494 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.237507 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.237517 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:18Z","lastTransitionTime":"2026-02-19T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.239302 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed9
7590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.247833 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.259125 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.272422 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.289687 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.294859 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:45:18 crc kubenswrapper[4873]: W0219 09:45:18.309403 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacb9409d_e5b1_4d32_9200_8dc32d8923d2.slice/crio-a59bc3455f270192605d5ae9a48a36c898f190bbee88506712627bced73d7bbb WatchSource:0}: Error finding container a59bc3455f270192605d5ae9a48a36c898f190bbee88506712627bced73d7bbb: Status 404 returned error can't find the container with id a59bc3455f270192605d5ae9a48a36c898f190bbee88506712627bced73d7bbb Feb 19 09:45:18 crc kubenswrapper[4873]: W0219 09:45:18.311794 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c61760e_2955_4688_b68b_1ceeda73f356.slice/crio-80c8293e37eae8d5fd3b9053b4e8e6a37966234aa3490061d14d879f58dacde0 WatchSource:0}: Error finding container 80c8293e37eae8d5fd3b9053b4e8e6a37966234aa3490061d14d879f58dacde0: Status 404 returned error can't find the container with id 80c8293e37eae8d5fd3b9053b4e8e6a37966234aa3490061d14d879f58dacde0 Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.340084 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.340217 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.340237 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.340259 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.340276 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:18Z","lastTransitionTime":"2026-02-19T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.360834 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j94bh"] Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.362371 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.367424 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.367660 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.367791 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.367829 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.368402 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.368557 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.368754 4873 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.381421 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390510 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-ovn\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390548 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-systemd-units\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390572 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-log-socket\") pod 
\"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390593 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-var-lib-openvswitch\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390613 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-ovnkube-script-lib\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390642 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-cni-netd\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390661 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390685 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-openvswitch\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390706 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-run-ovn-kubernetes\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390727 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-etc-openvswitch\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390751 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-systemd\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390784 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7760a15-9ea0-42f0-b42b-72de30071d14-ovn-node-metrics-cert\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390842 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-env-overrides\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390945 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-ovnkube-config\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.390996 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-kubelet\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.391029 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-cni-bin\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.391049 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz7vl\" (UniqueName: \"kubernetes.io/projected/a7760a15-9ea0-42f0-b42b-72de30071d14-kube-api-access-vz7vl\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.391072 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-slash\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.391140 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-run-netns\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.391166 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-node-log\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.394561 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.409869 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.426780 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.432508 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 09:25:22.22645834 +0000 UTC Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.441961 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.444773 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.444811 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.444822 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.444865 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.444879 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:18Z","lastTransitionTime":"2026-02-19T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.457534 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.469221 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.482573 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.483801 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:18 crc kubenswrapper[4873]: E0219 09:45:18.483913 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492069 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-systemd-units\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492169 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-ovn\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492199 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-log-socket\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492224 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-var-lib-openvswitch\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492247 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-ovnkube-script-lib\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492292 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-cni-netd\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492330 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492358 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-openvswitch\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492380 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-run-ovn-kubernetes\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492405 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-etc-openvswitch\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492427 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-systemd\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492452 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7760a15-9ea0-42f0-b42b-72de30071d14-ovn-node-metrics-cert\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492484 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-env-overrides\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492529 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-ovnkube-config\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492555 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-kubelet\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc 
kubenswrapper[4873]: I0219 09:45:18.492590 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-cni-bin\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492664 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz7vl\" (UniqueName: \"kubernetes.io/projected/a7760a15-9ea0-42f0-b42b-72de30071d14-kube-api-access-vz7vl\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492690 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-slash\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492714 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-run-netns\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492739 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-node-log\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492866 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-node-log\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492924 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-systemd-units\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492957 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-ovn\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.492998 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-log-socket\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.493031 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-var-lib-openvswitch\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.494191 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-openvswitch\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.494284 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-cni-netd\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.494331 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.494372 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-run-ovn-kubernetes\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.494396 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-etc-openvswitch\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.494432 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-cni-bin\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.494474 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-systemd\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.494532 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-slash\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.494999 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-env-overrides\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.495068 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-kubelet\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.495134 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-run-netns\") pod \"ovnkube-node-j94bh\" (UID: 
\"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.496299 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-ovnkube-config\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.497229 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-ovnkube-script-lib\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.497184 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.516116 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7760a15-9ea0-42f0-b42b-72de30071d14-ovn-node-metrics-cert\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.521772 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.527983 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz7vl\" (UniqueName: \"kubernetes.io/projected/a7760a15-9ea0-42f0-b42b-72de30071d14-kube-api-access-vz7vl\") pod \"ovnkube-node-j94bh\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.530503 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.544755 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.546930 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.546971 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.546983 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.547000 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.547012 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:18Z","lastTransitionTime":"2026-02-19T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.558947 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.656865 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.656929 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.656945 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.656968 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.656989 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:18Z","lastTransitionTime":"2026-02-19T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.658183 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.658245 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"9afd159bda3d2cbd676930ddf4df8cf39b5da5575d7c4d647ae91446f1b76837"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.658257 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"80c8293e37eae8d5fd3b9053b4e8e6a37966234aa3490061d14d879f58dacde0"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.659328 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pp77w" event={"ID":"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea","Type":"ContainerStarted","Data":"28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.659376 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pp77w" event={"ID":"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea","Type":"ContainerStarted","Data":"905da86833ed88c53cfd0279655c029c499ad0bee8a36d9732d486921b652bf5"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.661557 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" 
event={"ID":"acb9409d-e5b1-4d32-9200-8dc32d8923d2","Type":"ContainerStarted","Data":"844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.661586 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" event={"ID":"acb9409d-e5b1-4d32-9200-8dc32d8923d2","Type":"ContainerStarted","Data":"a59bc3455f270192605d5ae9a48a36c898f190bbee88506712627bced73d7bbb"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.662965 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4pk8x" event={"ID":"e1ae3d8d-27cf-489f-a6ba-ef914db74bff","Type":"ContainerStarted","Data":"6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.663023 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4pk8x" event={"ID":"e1ae3d8d-27cf-489f-a6ba-ef914db74bff","Type":"ContainerStarted","Data":"5a187ab1c980a5e2fb5e0b037e5e6f00a42109df3c8de8358ae4d423f9a958e6"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.673779 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.686536 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.697752 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.699942 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.715457 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: W0219 09:45:18.720799 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7760a15_9ea0_42f0_b42b_72de30071d14.slice/crio-542002b7bbc20e4e4f7ed68e13539e1b5d49a0679ef11d6b86cc15c762bc318b WatchSource:0}: Error finding container 542002b7bbc20e4e4f7ed68e13539e1b5d49a0679ef11d6b86cc15c762bc318b: Status 404 returned error can't find the container with id 542002b7bbc20e4e4f7ed68e13539e1b5d49a0679ef11d6b86cc15c762bc318b Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.733593 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.751077 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.762476 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.762525 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.762539 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.762557 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.762567 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:18Z","lastTransitionTime":"2026-02-19T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.767498 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.784838 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.804688 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.823865 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.840732 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.861940 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.864339 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.864376 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.864386 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.864402 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.864413 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:18Z","lastTransitionTime":"2026-02-19T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.883820 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.894797 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T
09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.911541 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.950022 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.967398 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.967461 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.967479 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.967503 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.967520 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:18Z","lastTransitionTime":"2026-02-19T09:45:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:18 crc kubenswrapper[4873]: I0219 09:45:18.985639 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.024369 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.069319 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.069362 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.069371 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:19 crc 
kubenswrapper[4873]: I0219 09:45:19.069385 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.069396 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:19Z","lastTransitionTime":"2026-02-19T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.071733 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.099785 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.099958 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:45:27.099931042 +0000 UTC m=+36.389362680 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.105091 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.146413 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.171831 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:19 crc 
kubenswrapper[4873]: I0219 09:45:19.172070 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.172201 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.172327 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.172415 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:19Z","lastTransitionTime":"2026-02-19T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.198180 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.201488 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.201730 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.201838 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.201935 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.201623 4873 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.201839 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.202201 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.201883 4873 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.202242 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.202256 4873 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.202300 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:27.202284913 +0000 UTC m=+36.491716551 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.202024 4873 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.202330 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:27.202323814 +0000 UTC m=+36.491755452 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.202225 4873 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.202357 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:27.202352645 +0000 UTC m=+36.491784283 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.202599 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:27.20257682 +0000 UTC m=+36.492008468 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.233693 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.263894 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.276740 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.276774 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.276783 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.276801 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.276810 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:19Z","lastTransitionTime":"2026-02-19T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.307869 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed9
7590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.349869 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.379720 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.379760 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.379773 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.379796 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.379809 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:19Z","lastTransitionTime":"2026-02-19T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.432769 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 11:11:37.787616531 +0000 UTC Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.482036 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.482366 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.482379 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.482394 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.482403 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:19Z","lastTransitionTime":"2026-02-19T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.483161 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.483260 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.483609 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:19 crc kubenswrapper[4873]: E0219 09:45:19.483674 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.503989 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-kbv7k"] Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.504340 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kbv7k" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.506004 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.506160 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.506050 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.506891 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.517765 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.530092 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.548289 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.585228 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.585455 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.585556 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.585643 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.585718 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:19Z","lastTransitionTime":"2026-02-19T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.590360 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.605641 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/33fdff17-cdda-468e-8520-7f0937acd8db-host\") pod \"node-ca-kbv7k\" (UID: \"33fdff17-cdda-468e-8520-7f0937acd8db\") " pod="openshift-image-registry/node-ca-kbv7k" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.605687 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gql7d\" (UniqueName: \"kubernetes.io/projected/33fdff17-cdda-468e-8520-7f0937acd8db-kube-api-access-gql7d\") pod \"node-ca-kbv7k\" (UID: \"33fdff17-cdda-468e-8520-7f0937acd8db\") " pod="openshift-image-registry/node-ca-kbv7k" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.605855 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/33fdff17-cdda-468e-8520-7f0937acd8db-serviceca\") pod \"node-ca-kbv7k\" (UID: \"33fdff17-cdda-468e-8520-7f0937acd8db\") " pod="openshift-image-registry/node-ca-kbv7k" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.626762 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.667874 4873 generic.go:334] "Generic (PLEG): container finished" podID="acb9409d-e5b1-4d32-9200-8dc32d8923d2" containerID="844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795" exitCode=0 Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.667981 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" 
event={"ID":"acb9409d-e5b1-4d32-9200-8dc32d8923d2","Type":"ContainerDied","Data":"844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795"} Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.670665 4873 generic.go:334] "Generic (PLEG): container finished" podID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerID="d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b" exitCode=0 Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.670762 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerDied","Data":"d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b"} Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.670848 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerStarted","Data":"542002b7bbc20e4e4f7ed68e13539e1b5d49a0679ef11d6b86cc15c762bc318b"} Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.683225 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.689349 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.689399 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.689414 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.689432 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.689444 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:19Z","lastTransitionTime":"2026-02-19T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.707062 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/33fdff17-cdda-468e-8520-7f0937acd8db-host\") pod \"node-ca-kbv7k\" (UID: \"33fdff17-cdda-468e-8520-7f0937acd8db\") " pod="openshift-image-registry/node-ca-kbv7k" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.707194 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gql7d\" (UniqueName: \"kubernetes.io/projected/33fdff17-cdda-468e-8520-7f0937acd8db-kube-api-access-gql7d\") pod \"node-ca-kbv7k\" (UID: \"33fdff17-cdda-468e-8520-7f0937acd8db\") " pod="openshift-image-registry/node-ca-kbv7k" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.707481 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/33fdff17-cdda-468e-8520-7f0937acd8db-serviceca\") pod \"node-ca-kbv7k\" (UID: \"33fdff17-cdda-468e-8520-7f0937acd8db\") " pod="openshift-image-registry/node-ca-kbv7k" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.707626 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/33fdff17-cdda-468e-8520-7f0937acd8db-host\") pod \"node-ca-kbv7k\" (UID: \"33fdff17-cdda-468e-8520-7f0937acd8db\") " pod="openshift-image-registry/node-ca-kbv7k" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.712278 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/33fdff17-cdda-468e-8520-7f0937acd8db-serviceca\") pod \"node-ca-kbv7k\" (UID: \"33fdff17-cdda-468e-8520-7f0937acd8db\") " pod="openshift-image-registry/node-ca-kbv7k" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.721224 4873 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.741031 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gql7d\" (UniqueName: \"kubernetes.io/projected/33fdff17-cdda-468e-8520-7f0937acd8db-kube-api-access-gql7d\") pod \"node-ca-kbv7k\" (UID: \"33fdff17-cdda-468e-8520-7f0937acd8db\") " pod="openshift-image-registry/node-ca-kbv7k" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.766494 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.797632 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.797672 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.797687 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.797706 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.797718 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:19Z","lastTransitionTime":"2026-02-19T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.797936 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kbv7k" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.809241 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: W0219 09:45:19.814386 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33fdff17_cdda_468e_8520_7f0937acd8db.slice/crio-4958538e56e82b562971372acc5abe128e2607afeb4e6289824a3e581a5b5f31 WatchSource:0}: Error finding container 4958538e56e82b562971372acc5abe128e2607afeb4e6289824a3e581a5b5f31: Status 404 returned error can't find the container with id 4958538e56e82b562971372acc5abe128e2607afeb4e6289824a3e581a5b5f31 Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.845564 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.887463 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.907136 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.908373 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.908392 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.908436 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.908447 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:19Z","lastTransitionTime":"2026-02-19T09:45:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.929066 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:19 crc kubenswrapper[4873]: I0219 09:45:19.968391 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.008998 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.012554 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.012584 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.012613 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.012630 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.012639 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:20Z","lastTransitionTime":"2026-02-19T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.045331 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-control
ler-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.084208 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.114589 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 
09:45:20.114632 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.114643 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.114659 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.114670 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:20Z","lastTransitionTime":"2026-02-19T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.128210 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.168122 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.206861 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0da
d8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.216578 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.216617 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.216626 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.216640 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.216651 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:20Z","lastTransitionTime":"2026-02-19T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.245059 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.287163 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.318808 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.318848 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.318857 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.318870 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.318881 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:20Z","lastTransitionTime":"2026-02-19T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.332673 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.365584 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.407583 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.422014 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.422061 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.422072 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.422088 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.422121 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:20Z","lastTransitionTime":"2026-02-19T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.434259 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 06:10:32.494306312 +0000 UTC Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.445806 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.484087 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:20 crc kubenswrapper[4873]: E0219 09:45:20.484240 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.485194 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.524026 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.524060 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.524068 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.524088 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.524098 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:20Z","lastTransitionTime":"2026-02-19T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.533876 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.567702 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.625945 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.625984 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.625995 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:20 crc 
kubenswrapper[4873]: I0219 09:45:20.626010 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.626020 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:20Z","lastTransitionTime":"2026-02-19T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.676456 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kbv7k" event={"ID":"33fdff17-cdda-468e-8520-7f0937acd8db","Type":"ContainerStarted","Data":"e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.676525 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kbv7k" event={"ID":"33fdff17-cdda-468e-8520-7f0937acd8db","Type":"ContainerStarted","Data":"4958538e56e82b562971372acc5abe128e2607afeb4e6289824a3e581a5b5f31"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.678251 4873 generic.go:334] "Generic (PLEG): container finished" podID="acb9409d-e5b1-4d32-9200-8dc32d8923d2" containerID="3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505" exitCode=0 Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.678333 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" event={"ID":"acb9409d-e5b1-4d32-9200-8dc32d8923d2","Type":"ContainerDied","Data":"3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.690010 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerStarted","Data":"cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.690058 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerStarted","Data":"fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.690087 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerStarted","Data":"c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.690132 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerStarted","Data":"531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.690149 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerStarted","Data":"818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.690164 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerStarted","Data":"ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.701592 4873 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.720206 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.729156 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.729196 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.729213 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:20 crc 
kubenswrapper[4873]: I0219 09:45:20.729234 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.729249 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:20Z","lastTransitionTime":"2026-02-19T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.740168 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.769248 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.784597 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.808648 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.832386 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.832431 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.832452 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.832477 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.832494 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:20Z","lastTransitionTime":"2026-02-19T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.848094 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.894683 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0da
d8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.926197 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.935721 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.935771 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.935788 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.935812 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.935832 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:20Z","lastTransitionTime":"2026-02-19T09:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:20 crc kubenswrapper[4873]: I0219 09:45:20.964705 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.026265 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.039337 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.039377 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.039387 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.039401 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.039411 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:21Z","lastTransitionTime":"2026-02-19T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.066742 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.092029 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.129027 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.141365 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.141424 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.141442 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:21 crc 
kubenswrapper[4873]: I0219 09:45:21.141466 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.141483 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:21Z","lastTransitionTime":"2026-02-19T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.166949 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.175310 4873 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175488 4873 reflector.go:484] object-"openshift-image-registry"/"node-ca-dockercfg-4777p": watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175493 4873 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175541 4873 reflector.go:484] 
object-"openshift-ovn-kubernetes"/"ovnkube-config": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175585 4873 reflector.go:484] object-"openshift-machine-config-operator"/"proxy-tls": watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175612 4873 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175637 4873 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175654 4873 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175667 4873 reflector.go:484] object-"openshift-machine-config-operator"/"kube-rbac-proxy": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding 
Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175679 4873 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175542 4873 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175630 4873 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175554 4873 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175554 4873 reflector.go:484] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175583 4873 reflector.go:484] object-"openshift-multus"/"multus-daemon-config": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode 
an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175583 4873 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175579 4873 reflector.go:484] object-"openshift-ovn-kubernetes"/"env-overrides": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175582 4873 reflector.go:484] pkg/kubelet/config/apiserver.go:66: watch of *v1.Pod ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175760 4873 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175618 4873 reflector.go:484] object-"openshift-image-registry"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175770 4873 reflector.go:484] 
object-"openshift-image-registry"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175800 4873 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175624 4873 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175818 4873 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175830 4873 reflector.go:484] object-"openshift-multus"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175623 4873 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via 
ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175849 4873 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175725 4873 reflector.go:484] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175868 4873 reflector.go:484] object-"openshift-multus"/"default-cni-sysctl-allowlist": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175772 4873 reflector.go:484] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175830 4873 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175895 4873 reflector.go:484] k8s.io/client-go/informers/factory.go:160: 
watch of *v1.Service ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175834 4873 reflector.go:484] object-"openshift-machine-config-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175912 4873 reflector.go:484] object-"openshift-multus"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175939 4873 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175961 4873 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.175971 4873 reflector.go:484] object-"openshift-multus"/"cni-copy-resources": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 
crc kubenswrapper[4873]: W0219 09:45:21.175963 4873 reflector.go:484] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.176002 4873 reflector.go:484] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.176009 4873 reflector.go:484] object-"openshift-multus"/"default-dockercfg-2q5b6": watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: W0219 09:45:21.176016 4873 reflector.go:484] object-"openshift-image-registry"/"image-registry-certificates": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.244352 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.244383 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.244392 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.244405 
4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.244414 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:21Z","lastTransitionTime":"2026-02-19T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.346761 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.346807 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.346823 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.346843 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.346858 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:21Z","lastTransitionTime":"2026-02-19T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.435171 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 05:08:25.03492548 +0000 UTC
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.449203 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.449252 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.449265 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.449281 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.449292 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:21Z","lastTransitionTime":"2026-02-19T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.483743 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 09:45:21 crc kubenswrapper[4873]: E0219 09:45:21.483888 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.484020 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 09:45:21 crc kubenswrapper[4873]: E0219 09:45:21.484181 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.551860 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.551893 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.551900 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.551914 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.551924 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:21Z","lastTransitionTime":"2026-02-19T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.654163 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.654240 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.654264 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.654294 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.654317 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:21Z","lastTransitionTime":"2026-02-19T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.699151 4873 generic.go:334] "Generic (PLEG): container finished" podID="acb9409d-e5b1-4d32-9200-8dc32d8923d2" containerID="bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189" exitCode=0
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.699217 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" event={"ID":"acb9409d-e5b1-4d32-9200-8dc32d8923d2","Type":"ContainerDied","Data":"bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189"}
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.757483 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.757531 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.757543 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.757561 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.757577 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:21Z","lastTransitionTime":"2026-02-19T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.861293 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.861362 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.861383 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.861409 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.861428 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:21Z","lastTransitionTime":"2026-02-19T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.964489 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.965010 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.965031 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.965057 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:21 crc kubenswrapper[4873]: I0219 09:45:21.965076 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:21Z","lastTransitionTime":"2026-02-19T09:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.030323 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.037834 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.038939 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.049847 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.065834 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.068008 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.068046 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.068058 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.068074 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.068085 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:22Z","lastTransitionTime":"2026-02-19T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.089479 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.093994 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.171237 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.171277 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.171286 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.171301 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.171310 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:22Z","lastTransitionTime":"2026-02-19T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.191989 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.204479 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.214305 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.221024 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.233391 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da5575d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.235908 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.244061 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.254592 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.256087 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.273477 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.275412 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.275446 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.275460 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.275476 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.275488 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:22Z","lastTransitionTime":"2026-02-19T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.285831 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.293686 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.316580 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.326705 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.339346 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.353965 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.370282 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc1
5eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.378483 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.378548 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.378568 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.378592 4873 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.378611 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:22Z","lastTransitionTime":"2026-02-19T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.395759 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\
"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v
z7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswi
tch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.397148 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.402445 4873 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.410929 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.423921 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.428018 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.436404 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 10:29:46.167120903 +0000 UTC Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.437688 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.445252 4873 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.449678 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.461932 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 
09:45:22.475042 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.476286 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\
\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.481155 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.483229 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:22 crc kubenswrapper[4873]: E0219 09:45:22.483365 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.484654 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.484733 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.484760 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.484812 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.484833 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:22Z","lastTransitionTime":"2026-02-19T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.500303 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.519475 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.522866 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.524439 4873 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.539510 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.549621 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.564943 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.565161 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.576852 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.577209 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.589708 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.589748 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.589776 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.589791 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.589800 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:22Z","lastTransitionTime":"2026-02-19T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.597091 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.637009 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.657451 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.678264 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.692077 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.692155 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.692172 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.692196 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.692215 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:22Z","lastTransitionTime":"2026-02-19T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.705243 4873 generic.go:334] "Generic (PLEG): container finished" podID="acb9409d-e5b1-4d32-9200-8dc32d8923d2" containerID="c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a" exitCode=0 Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.705302 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" event={"ID":"acb9409d-e5b1-4d32-9200-8dc32d8923d2","Type":"ContainerDied","Data":"c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a"} Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.712060 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerStarted","Data":"02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578"} Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.712587 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.717441 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.736891 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.756747 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 
09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.794413 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.794451 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.794463 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.794478 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.794490 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:22Z","lastTransitionTime":"2026-02-19T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.796484 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.817046 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.836716 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.856379 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.876382 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.896758 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.896790 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.896804 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.896820 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.896832 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:22Z","lastTransitionTime":"2026-02-19T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.903151 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.976915 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.990375 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.999509 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.999549 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.999563 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:22 crc 
kubenswrapper[4873]: I0219 09:45:22.999583 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:22 crc kubenswrapper[4873]: I0219 09:45:22.999598 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:22Z","lastTransitionTime":"2026-02-19T09:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.030545 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.064587 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84
febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.102561 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.102616 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.102632 4873 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.102653 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.102668 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:23Z","lastTransitionTime":"2026-02-19T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.105923 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.149670 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.187034 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.206475 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.206547 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.206567 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.206592 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.206612 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:23Z","lastTransitionTime":"2026-02-19T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.229668 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.264450 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.311147 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.311421 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.311467 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.311478 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.311498 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.311509 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:23Z","lastTransitionTime":"2026-02-19T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.349461 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.384568 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.413731 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.413769 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.413783 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:23 crc 
kubenswrapper[4873]: I0219 09:45:23.413799 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.413810 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:23Z","lastTransitionTime":"2026-02-19T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.433287 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb
1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.436593 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 06:40:55.084003919 +0000 UTC Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.465429 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.485904 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:23 crc kubenswrapper[4873]: E0219 09:45:23.486025 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.486431 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:23 crc kubenswrapper[4873]: E0219 09:45:23.486494 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.508868 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.518340 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.518387 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.518397 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.518415 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.518425 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:23Z","lastTransitionTime":"2026-02-19T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.551663 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:
45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.596414 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.625139 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.625187 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.625201 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.625217 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.625229 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:23Z","lastTransitionTime":"2026-02-19T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.627482 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da5575d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.672582 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.708538 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.719408 4873 generic.go:334] "Generic (PLEG): container finished" podID="acb9409d-e5b1-4d32-9200-8dc32d8923d2" containerID="34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6" exitCode=0 Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.719467 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-n2lwn" event={"ID":"acb9409d-e5b1-4d32-9200-8dc32d8923d2","Type":"ContainerDied","Data":"34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6"} Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.726933 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.726962 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.726973 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.727005 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.727016 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:23Z","lastTransitionTime":"2026-02-19T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.750182 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z 
is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.792073 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.828958 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.829905 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.829951 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.829966 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.829986 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.830001 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:23Z","lastTransitionTime":"2026-02-19T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.863740 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.904867 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.932452 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.932491 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.932505 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.932524 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.932537 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:23Z","lastTransitionTime":"2026-02-19T09:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.957241 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:23 crc kubenswrapper[4873]: I0219 09:45:23.983824 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.030402 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.035381 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.035404 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.035413 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.035425 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.035434 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:24Z","lastTransitionTime":"2026-02-19T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.064937 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.105510 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.139991 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.140046 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.140062 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.140087 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.140132 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:24Z","lastTransitionTime":"2026-02-19T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.150870 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z 
is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.191230 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.234771 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.243143 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.243201 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.243224 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.243253 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.243275 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:24Z","lastTransitionTime":"2026-02-19T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.270703 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.309167 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.351563 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.351640 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.351659 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.351684 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.352366 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:24Z","lastTransitionTime":"2026-02-19T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.367137 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.386153 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.436511 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.437316 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 16:58:19.131212465 +0000 UTC Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.456993 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.457033 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.457043 4873 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.457056 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.457066 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:24Z","lastTransitionTime":"2026-02-19T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.472954 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.483315 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:24 crc kubenswrapper[4873]: E0219 09:45:24.483558 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.513460 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.552380 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.560327 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.560391 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.560416 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.560444 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.560466 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:24Z","lastTransitionTime":"2026-02-19T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.589516 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.636781 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.663703 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.663755 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.663772 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:24 crc 
kubenswrapper[4873]: I0219 09:45:24.663820 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.663836 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:24Z","lastTransitionTime":"2026-02-19T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.669841 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.729624 4873 generic.go:334] "Generic (PLEG): container finished" podID="acb9409d-e5b1-4d32-9200-8dc32d8923d2" containerID="5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39" exitCode=0 Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.729704 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" 
event={"ID":"acb9409d-e5b1-4d32-9200-8dc32d8923d2","Type":"ContainerDied","Data":"5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39"} Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.747031 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.760635 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.767278 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.767342 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.767365 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:24 crc 
kubenswrapper[4873]: I0219 09:45:24.767396 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.767419 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:24Z","lastTransitionTime":"2026-02-19T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.787529 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.827024 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.866795 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.869234 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:24 crc 
kubenswrapper[4873]: I0219 09:45:24.869264 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.869274 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.869288 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.869298 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:24Z","lastTransitionTime":"2026-02-19T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.905376 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb
1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.944159 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.972016 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.972063 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.972080 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.972127 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:24 crc kubenswrapper[4873]: 
I0219 09:45:24.972146 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:24Z","lastTransitionTime":"2026-02-19T09:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:24 crc kubenswrapper[4873]: I0219 09:45:24.984144 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.036245 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.071132 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.074601 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.074671 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.074702 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.074746 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.074786 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:25Z","lastTransitionTime":"2026-02-19T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.104425 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.144454 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.177358 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.177400 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.177412 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.177430 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.177442 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:25Z","lastTransitionTime":"2026-02-19T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.189338 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.224443 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.279236 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.279275 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.279287 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.279302 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.279315 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:25Z","lastTransitionTime":"2026-02-19T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.383182 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.383237 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.383258 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.383283 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.383302 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:25Z","lastTransitionTime":"2026-02-19T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.438013 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 12:40:11.372454571 +0000 UTC Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.484004 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.484040 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:25 crc kubenswrapper[4873]: E0219 09:45:25.484250 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:25 crc kubenswrapper[4873]: E0219 09:45:25.484433 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.491054 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.491148 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.491177 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.491226 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.491298 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:25Z","lastTransitionTime":"2026-02-19T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.594345 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.594397 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.594411 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.594430 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.594443 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:25Z","lastTransitionTime":"2026-02-19T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.697680 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.697728 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.697743 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.697763 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.697776 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:25Z","lastTransitionTime":"2026-02-19T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.735973 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" event={"ID":"acb9409d-e5b1-4d32-9200-8dc32d8923d2","Type":"ContainerStarted","Data":"566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e"} Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.744781 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerStarted","Data":"2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019"} Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.745815 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.745917 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.752137 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.764830 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.770320 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.770425 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.779310 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.791420 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.800480 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.800519 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.800530 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.800546 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.800556 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:25Z","lastTransitionTime":"2026-02-19T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.804612 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da5575d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.817392 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa01
3a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:
45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.828179 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.839565 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.852722 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.869773 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.880679 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.898141 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.903540 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.903598 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.903616 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.903638 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:25 crc kubenswrapper[4873]: 
I0219 09:45:25.903655 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:25Z","lastTransitionTime":"2026-02-19T09:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.909289 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.923034 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.936091 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-
o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.949980 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.965001 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.982333 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa01
3a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:
45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:25 crc kubenswrapper[4873]: I0219 09:45:25.995378 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.006962 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.007017 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.007035 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.007058 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.007076 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:26Z","lastTransitionTime":"2026-02-19T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.012403 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.031668 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.078468 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.104937 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.109585 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.109638 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.109652 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.109697 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.109710 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:26Z","lastTransitionTime":"2026-02-19T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.156419 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.185089 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.212496 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.212547 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.212606 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.212633 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.212650 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:26Z","lastTransitionTime":"2026-02-19T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.230346 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.269866 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.304746 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.315529 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.315595 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.315613 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.315638 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.315658 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:26Z","lastTransitionTime":"2026-02-19T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.347639 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da5575d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.419247 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:26 crc 
kubenswrapper[4873]: I0219 09:45:26.419393 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.419419 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.419449 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.419475 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:26Z","lastTransitionTime":"2026-02-19T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.438564 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 17:34:17.44741874 +0000 UTC Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.484075 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:26 crc kubenswrapper[4873]: E0219 09:45:26.484567 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.522181 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.522272 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.522296 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.522321 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.522339 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:26Z","lastTransitionTime":"2026-02-19T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.625476 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.625533 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.625551 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.625575 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.625592 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:26Z","lastTransitionTime":"2026-02-19T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.728379 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.728456 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.728474 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.728506 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.728529 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:26Z","lastTransitionTime":"2026-02-19T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.831500 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.831885 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.832019 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.832202 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.832339 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:26Z","lastTransitionTime":"2026-02-19T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.935724 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.935791 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.935816 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.935848 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:26 crc kubenswrapper[4873]: I0219 09:45:26.935870 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:26Z","lastTransitionTime":"2026-02-19T09:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.037959 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.038011 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.038037 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.038051 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.038059 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.110222 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.110440 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 09:45:43.110413012 +0000 UTC m=+52.399844670 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.111983 4873 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.139991 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.140025 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.140034 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.140047 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.140055 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.163783 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.163834 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.163844 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.163875 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.163885 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.176412 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.179313 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.179334 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.179344 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.179357 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.179366 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.194984 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.198494 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.198547 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.198560 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.198576 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.198590 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.210641 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.210857 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.210909 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.210943 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.210970 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.211034 4873 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.211042 4873 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.211074 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.211093 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.211123 4873 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.211074 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.211150 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.211161 4873 
projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.211082 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:43.211068951 +0000 UTC m=+52.500500589 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.211209 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:43.211197094 +0000 UTC m=+52.500628732 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.211226 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:43.211218914 +0000 UTC m=+52.500650552 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.211241 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:43.211233295 +0000 UTC m=+52.500664933 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.213977 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.214074 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.214177 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.214263 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.214343 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.226409 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.230181 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.230327 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.230434 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.230559 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.230652 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.242552 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.242877 4873 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.244570 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.244672 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.244752 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.244842 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.244914 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.347803 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.348038 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.348135 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.348230 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.348337 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.355847 4873 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.438801 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 18:57:57.499660517 +0000 UTC Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.452598 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.452862 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.452894 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.452926 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.452962 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.483650 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.483767 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.483816 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:27 crc kubenswrapper[4873]: E0219 09:45:27.484076 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.556302 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.556355 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.556374 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.556396 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.556411 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.659127 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.659189 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.659207 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.659232 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.659249 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.761181 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.761211 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.761219 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.761231 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.761239 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.864113 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.864341 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.864409 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.864502 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.864567 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.968341 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.968643 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.968818 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.969025 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:27 crc kubenswrapper[4873]: I0219 09:45:27.969446 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:27Z","lastTransitionTime":"2026-02-19T09:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.072996 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.073063 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.073084 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.073137 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.073160 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:28Z","lastTransitionTime":"2026-02-19T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.175453 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.175497 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.175507 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.175522 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.175534 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:28Z","lastTransitionTime":"2026-02-19T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.278150 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.278213 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.278231 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.278259 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.278278 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:28Z","lastTransitionTime":"2026-02-19T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.380603 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.380645 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.380661 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.380681 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.380697 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:28Z","lastTransitionTime":"2026-02-19T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.440353 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 01:40:47.567049124 +0000 UTC
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.483842 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 09:45:28 crc kubenswrapper[4873]: E0219 09:45:28.483994 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.484420 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.484451 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.484468 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.484489 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.484505 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:28Z","lastTransitionTime":"2026-02-19T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.587668 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.587766 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.587784 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.587843 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.587861 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:28Z","lastTransitionTime":"2026-02-19T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.626408 4873 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.691733 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.691833 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.691872 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.691940 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.691981 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:28Z","lastTransitionTime":"2026-02-19T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.757687 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/0.log" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.761862 4873 generic.go:334] "Generic (PLEG): container finished" podID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerID="2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019" exitCode=1 Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.761927 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerDied","Data":"2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019"} Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.763434 4873 scope.go:117] "RemoveContainer" containerID="2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.780741 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.795061 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.795451 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.795484 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.795492 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:28 crc 
kubenswrapper[4873]: I0219 09:45:28.795506 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.795520 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:28Z","lastTransitionTime":"2026-02-19T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.810820 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b
e40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.828809 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.839889 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.855228 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.883371 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"ce (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736581 6188 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736599 6188 
reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736654 6188 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736805 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736917 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 09:45:27.737306 6188 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.737480 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:27.737633 6188 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 09:45:27.738132 6188 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.897836 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.897883 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.897904 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.897932 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.897957 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:28Z","lastTransitionTime":"2026-02-19T09:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.899718 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.927992 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.951897 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.970365 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.984641 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:28 crc kubenswrapper[4873]: I0219 09:45:28.997382 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.000724 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.000775 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.000804 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.000826 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.000841 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:29Z","lastTransitionTime":"2026-02-19T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.011452 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.103907 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.104331 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.104371 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.104485 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.104511 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:29Z","lastTransitionTime":"2026-02-19T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.207818 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.207873 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.207886 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.207905 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.207918 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:29Z","lastTransitionTime":"2026-02-19T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.310377 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.310420 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.310437 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.310460 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.310478 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:29Z","lastTransitionTime":"2026-02-19T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.413526 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.413587 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.413607 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.413631 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.413647 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:29Z","lastTransitionTime":"2026-02-19T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.441495 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 16:08:59.738202622 +0000 UTC Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.483617 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.483684 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:29 crc kubenswrapper[4873]: E0219 09:45:29.484061 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:29 crc kubenswrapper[4873]: E0219 09:45:29.484291 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.484454 4873 scope.go:117] "RemoveContainer" containerID="7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.520144 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.520217 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.520239 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.520275 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.520299 4873 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:29Z","lastTransitionTime":"2026-02-19T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.622483 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.622515 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.622523 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.622536 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.622546 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:29Z","lastTransitionTime":"2026-02-19T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.724885 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.724916 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.724925 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.724939 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.724949 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:29Z","lastTransitionTime":"2026-02-19T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.766427 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.768144 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3"} Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.768602 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.770539 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/0.log" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.773243 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerStarted","Data":"37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3"} Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.773808 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.785929 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.800326 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.811166 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.826890 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa01
3a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:
45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.827658 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.827694 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.827707 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.827723 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.827733 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:29Z","lastTransitionTime":"2026-02-19T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.843480 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.855344 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.869980 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.887578 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"ce (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736581 6188 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736599 6188 
reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736654 6188 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736805 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736917 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 09:45:27.737306 6188 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.737480 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:27.737633 6188 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 09:45:27.738132 6188 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.900000 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.913941 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.927648 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.930069 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.930100 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.930137 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:29 crc 
kubenswrapper[4873]: I0219 09:45:29.930159 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.930172 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:29Z","lastTransitionTime":"2026-02-19T09:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.943485 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.961071 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:29 crc kubenswrapper[4873]: I0219 09:45:29.980795 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.011227 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.032749 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.032791 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.032801 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.032816 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.032825 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:30Z","lastTransitionTime":"2026-02-19T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.037825 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.053195 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.066281 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.076344 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.087737 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.097859 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.109370 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.125922 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa01
3a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:
45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.135399 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.135427 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.135452 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.135466 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.135475 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:30Z","lastTransitionTime":"2026-02-19T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.139559 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.148972 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.161718 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.192393 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"ce (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736581 6188 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736599 6188 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736654 
6188 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736805 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736917 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 09:45:27.737306 6188 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.737480 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:27.737633 6188 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 09:45:27.738132 6188 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.205866 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.237908 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.237962 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.237981 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.238007 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.238024 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:30Z","lastTransitionTime":"2026-02-19T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.340538 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.340588 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.340601 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.340620 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.340632 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:30Z","lastTransitionTime":"2026-02-19T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.385859 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb"] Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.386305 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.388092 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.389888 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.407945 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.426244 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.441685 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 13:08:57.28899949 +0000 UTC Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.443972 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.444014 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.444026 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.444041 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.444055 4873 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:30Z","lastTransitionTime":"2026-02-19T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.446973 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.475885 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"ce (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736581 6188 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736599 6188 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736654 
6188 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736805 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736917 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 09:45:27.737306 6188 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.737480 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:27.737633 6188 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 09:45:27.738132 6188 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.484135 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:30 crc kubenswrapper[4873]: E0219 09:45:30.484309 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.491317 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"n
ame\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.505627 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.518470 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.534041 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.546899 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.546942 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.546956 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.546972 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.546982 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:30Z","lastTransitionTime":"2026-02-19T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.547669 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/566305a3-ea47-4e60-b247-5b32fa8544e2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t7gjb\" (UID: \"566305a3-ea47-4e60-b247-5b32fa8544e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.547824 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/566305a3-ea47-4e60-b247-5b32fa8544e2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t7gjb\" (UID: \"566305a3-ea47-4e60-b247-5b32fa8544e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.547921 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpvr5\" (UniqueName: \"kubernetes.io/projected/566305a3-ea47-4e60-b247-5b32fa8544e2-kube-api-access-gpvr5\") pod \"ovnkube-control-plane-749d76644c-t7gjb\" (UID: \"566305a3-ea47-4e60-b247-5b32fa8544e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.548038 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/566305a3-ea47-4e60-b247-5b32fa8544e2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t7gjb\" (UID: \"566305a3-ea47-4e60-b247-5b32fa8544e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.549009 4873 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.560739 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.575172 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.589864 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.606005 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.619594 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa01
3a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:
45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.635197 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.648537 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/566305a3-ea47-4e60-b247-5b32fa8544e2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t7gjb\" (UID: \"566305a3-ea47-4e60-b247-5b32fa8544e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.648614 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/566305a3-ea47-4e60-b247-5b32fa8544e2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t7gjb\" (UID: \"566305a3-ea47-4e60-b247-5b32fa8544e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.648660 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpvr5\" (UniqueName: \"kubernetes.io/projected/566305a3-ea47-4e60-b247-5b32fa8544e2-kube-api-access-gpvr5\") pod \"ovnkube-control-plane-749d76644c-t7gjb\" (UID: \"566305a3-ea47-4e60-b247-5b32fa8544e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.648715 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/566305a3-ea47-4e60-b247-5b32fa8544e2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t7gjb\" (UID: \"566305a3-ea47-4e60-b247-5b32fa8544e2\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.650362 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.650421 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.650432 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.650447 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.650458 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:30Z","lastTransitionTime":"2026-02-19T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.650670 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/566305a3-ea47-4e60-b247-5b32fa8544e2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t7gjb\" (UID: \"566305a3-ea47-4e60-b247-5b32fa8544e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.651483 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/566305a3-ea47-4e60-b247-5b32fa8544e2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t7gjb\" (UID: \"566305a3-ea47-4e60-b247-5b32fa8544e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.655637 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/566305a3-ea47-4e60-b247-5b32fa8544e2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t7gjb\" (UID: \"566305a3-ea47-4e60-b247-5b32fa8544e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.680518 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpvr5\" (UniqueName: \"kubernetes.io/projected/566305a3-ea47-4e60-b247-5b32fa8544e2-kube-api-access-gpvr5\") pod \"ovnkube-control-plane-749d76644c-t7gjb\" (UID: \"566305a3-ea47-4e60-b247-5b32fa8544e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.705522 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.753325 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.753360 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.753372 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.753388 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.753399 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:30Z","lastTransitionTime":"2026-02-19T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.775860 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/1.log" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.776703 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/0.log" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.778676 4873 generic.go:334] "Generic (PLEG): container finished" podID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerID="37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3" exitCode=1 Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.778717 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerDied","Data":"37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3"} Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.778748 4873 scope.go:117] "RemoveContainer" containerID="2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.779376 4873 scope.go:117] "RemoveContainer" containerID="37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3" Feb 19 09:45:30 crc kubenswrapper[4873]: E0219 09:45:30.779500 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.783334 4873 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" event={"ID":"566305a3-ea47-4e60-b247-5b32fa8544e2","Type":"ContainerStarted","Data":"3e9f8c301ee914f3296e8a090b69b2915883c93b034aab6deaa4dde2271ff8b7"} Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.793374 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.807596 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.817780 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.827018 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da5575d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.842046 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.855186 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.855594 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.855661 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.855725 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.855791 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:30Z","lastTransitionTime":"2026-02-19T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.855177 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.869767 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.882574 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.897336 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.910374 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.924210 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.942856 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"ce (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736581 6188 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736599 6188 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736654 
6188 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736805 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736917 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 09:45:27.737306 6188 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.737480 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:27.737633 6188 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 09:45:27.738132 6188 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"message\\\":\\\"311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:30.029768 6333 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0219 09:45:30.032587 6333 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 09:45:30.032616 6333 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0219 09:45:30.032670 6333 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 09:45:30.032733 6333 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 09:45:30.032745 6333 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:30.032824 6333 factory.go:656] Stopping watch factory\\\\nI0219 09:45:30.032844 6333 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:30.032881 6333 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:30.032897 6333 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:30.032908 6333 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:30.032931 6333 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/ne
tworks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.952620 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.958760 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.958790 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.958799 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.958816 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.958824 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:30Z","lastTransitionTime":"2026-02-19T09:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.966921 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:30 crc kubenswrapper[4873]: I0219 09:45:30.977484 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.061547 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.061593 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.061610 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.061630 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.061643 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:31Z","lastTransitionTime":"2026-02-19T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.164629 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.164694 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.164712 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.164736 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.164755 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:31Z","lastTransitionTime":"2026-02-19T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.267720 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.267756 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.267766 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.267781 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.267790 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:31Z","lastTransitionTime":"2026-02-19T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.370019 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.370095 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.370148 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.370173 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.370199 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:31Z","lastTransitionTime":"2026-02-19T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.442404 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 15:50:20.332249366 +0000 UTC Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.472437 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.472475 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.472484 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.472498 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.472507 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:31Z","lastTransitionTime":"2026-02-19T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.483874 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.483971 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:31 crc kubenswrapper[4873]: E0219 09:45:31.484029 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:31 crc kubenswrapper[4873]: E0219 09:45:31.484518 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.505867 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.517690 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.534204 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.550433 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.566717 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.574909 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.574979 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.575004 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.575059 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.575086 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:31Z","lastTransitionTime":"2026-02-19T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.587745 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:
45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.609983 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additiona
l-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8
c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/op
t/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.621971 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.638925 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.654029 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.665528 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.678483 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.678737 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.678890 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.679046 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.679232 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:31Z","lastTransitionTime":"2026-02-19T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.694262 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"ce (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736581 6188 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736599 6188 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736654 
6188 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736805 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736917 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 09:45:27.737306 6188 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.737480 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:27.737633 6188 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 09:45:27.738132 6188 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"message\\\":\\\"311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:30.029768 6333 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0219 09:45:30.032587 6333 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 09:45:30.032616 6333 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0219 09:45:30.032670 6333 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 09:45:30.032733 6333 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 09:45:30.032745 6333 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:30.032824 6333 factory.go:656] Stopping watch factory\\\\nI0219 09:45:30.032844 6333 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:30.032881 6333 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:30.032897 6333 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:30.032908 6333 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:30.032931 6333 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/ne
tworks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.710536 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.732776 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.749377 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.781917 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.781954 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.781965 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:31 crc 
kubenswrapper[4873]: I0219 09:45:31.781980 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.781990 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:31Z","lastTransitionTime":"2026-02-19T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.786966 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" event={"ID":"566305a3-ea47-4e60-b247-5b32fa8544e2","Type":"ContainerStarted","Data":"5bc5e33551a9c1389c85f629b887464d293d457f3206e462c8758a01fbaa6dd7"} Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.787035 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" event={"ID":"566305a3-ea47-4e60-b247-5b32fa8544e2","Type":"ContainerStarted","Data":"96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4"} Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.788899 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/1.log" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.794765 4873 scope.go:117] "RemoveContainer" containerID="37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3" Feb 19 09:45:31 crc kubenswrapper[4873]: E0219 09:45:31.795068 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=ovnkube-controller pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.804168 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.816608 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.831811 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.846840 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.864881 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.877830 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-lcp8k"] Feb 19 09:45:31 crc 
kubenswrapper[4873]: I0219 09:45:31.878504 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:31 crc kubenswrapper[4873]: E0219 09:45:31.878617 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.882541 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819e
edb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a45
16ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.883618 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.883691 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.883715 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.883744 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.883767 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:31Z","lastTransitionTime":"2026-02-19T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.897490 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d457f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.918123 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.936326 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.957931 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.963696 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvs9b\" (UniqueName: \"kubernetes.io/projected/98d35597-056d-48f0-b599-28b098dd45f3-kube-api-access-rvs9b\") pod \"network-metrics-daemon-lcp8k\" (UID: \"98d35597-056d-48f0-b599-28b098dd45f3\") " pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.963878 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs\") pod \"network-metrics-daemon-lcp8k\" (UID: \"98d35597-056d-48f0-b599-28b098dd45f3\") " pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.978331 4873 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fb8dc98187d774ae64314b294ea5d477d9205175171422f2ecb5e08a948f019\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:27Z\\\",\\\"message\\\":\\\"ce (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736581 6188 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736599 6188 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736654 
6188 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736805 6188 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.736917 6188 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 09:45:27.737306 6188 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 09:45:27.737480 6188 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:27.737633 6188 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 09:45:27.738132 6188 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"message\\\":\\\"311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:30.029768 6333 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0219 09:45:30.032587 6333 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 09:45:30.032616 6333 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0219 09:45:30.032670 6333 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 09:45:30.032733 6333 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 09:45:30.032745 6333 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:30.032824 6333 factory.go:656] Stopping watch factory\\\\nI0219 09:45:30.032844 6333 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:30.032881 6333 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:30.032897 6333 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:30.032908 6333 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:30.032931 6333 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/ne
tworks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.986163 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:31 crc 
kubenswrapper[4873]: I0219 09:45:31.986238 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.986256 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.986281 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.986298 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:31Z","lastTransitionTime":"2026-02-19T09:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:31 crc kubenswrapper[4873]: I0219 09:45:31.988996 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.002972 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.015649 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.027148 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.043051 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.061402 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.064835 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvs9b\" (UniqueName: \"kubernetes.io/projected/98d35597-056d-48f0-b599-28b098dd45f3-kube-api-access-rvs9b\") pod \"network-metrics-daemon-lcp8k\" (UID: \"98d35597-056d-48f0-b599-28b098dd45f3\") " pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.064975 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs\") pod \"network-metrics-daemon-lcp8k\" (UID: \"98d35597-056d-48f0-b599-28b098dd45f3\") " pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:32 crc kubenswrapper[4873]: E0219 09:45:32.065173 4873 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 09:45:32 crc kubenswrapper[4873]: E0219 09:45:32.065287 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs podName:98d35597-056d-48f0-b599-28b098dd45f3 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:32.565255344 +0000 UTC m=+41.854687022 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs") pod "network-metrics-daemon-lcp8k" (UID: "98d35597-056d-48f0-b599-28b098dd45f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.080083 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc 
kubenswrapper[4873]: I0219 09:45:32.089612 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.089651 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.089661 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.089678 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.089690 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:32Z","lastTransitionTime":"2026-02-19T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.092208 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvs9b\" (UniqueName: \"kubernetes.io/projected/98d35597-056d-48f0-b599-28b098dd45f3-kube-api-access-rvs9b\") pod \"network-metrics-daemon-lcp8k\" (UID: \"98d35597-056d-48f0-b599-28b098dd45f3\") " pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.100573 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.119351 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.137390 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa01
3a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:
45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.155439 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d457f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.176480 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-con
troller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.191812 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.191860 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.191876 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.191897 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.191912 4873 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:32Z","lastTransitionTime":"2026-02-19T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.194431 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountP
ath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.213371 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 
09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.242951 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"message\\\":\\\"311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:30.029768 6333 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0219 09:45:30.032587 6333 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 09:45:30.032616 6333 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 09:45:30.032670 6333 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 09:45:30.032733 6333 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 09:45:30.032745 6333 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:30.032824 6333 factory.go:656] Stopping watch factory\\\\nI0219 09:45:30.032844 6333 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:30.032881 6333 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:30.032897 6333 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:30.032908 6333 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:30.032931 6333 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c428
5fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.260007 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.280262 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.294990 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.295043 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.295058 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.295078 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.295094 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:32Z","lastTransitionTime":"2026-02-19T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.297482 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.311637 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.324261 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.398321 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.398380 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.398397 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.398421 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.398437 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:32Z","lastTransitionTime":"2026-02-19T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.443540 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 10:26:39.153434125 +0000 UTC Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.483310 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:32 crc kubenswrapper[4873]: E0219 09:45:32.483807 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.500631 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.500696 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.500717 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.500742 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.500761 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:32Z","lastTransitionTime":"2026-02-19T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.552922 4873 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.571715 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs\") pod \"network-metrics-daemon-lcp8k\" (UID: \"98d35597-056d-48f0-b599-28b098dd45f3\") " pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:32 crc kubenswrapper[4873]: E0219 09:45:32.571900 4873 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 09:45:32 crc kubenswrapper[4873]: E0219 09:45:32.571986 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs podName:98d35597-056d-48f0-b599-28b098dd45f3 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:33.571967303 +0000 UTC m=+42.861398951 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs") pod "network-metrics-daemon-lcp8k" (UID: "98d35597-056d-48f0-b599-28b098dd45f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.603062 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.603217 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.603246 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.603317 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.603345 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:32Z","lastTransitionTime":"2026-02-19T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.706341 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.706404 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.706424 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.706447 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.706464 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:32Z","lastTransitionTime":"2026-02-19T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.809680 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.809747 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.809764 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.809789 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.809813 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:32Z","lastTransitionTime":"2026-02-19T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.913070 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.913227 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.913248 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.913271 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:32 crc kubenswrapper[4873]: I0219 09:45:32.913288 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:32Z","lastTransitionTime":"2026-02-19T09:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.016015 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.016079 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.016137 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.016162 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.016179 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:33Z","lastTransitionTime":"2026-02-19T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.119280 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.119365 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.119390 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.119423 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.119447 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:33Z","lastTransitionTime":"2026-02-19T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.222201 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.222283 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.222309 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.222342 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.222367 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:33Z","lastTransitionTime":"2026-02-19T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.325805 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.325871 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.325889 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.325913 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.325934 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:33Z","lastTransitionTime":"2026-02-19T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.429021 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.429094 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.429139 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.429165 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.429182 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:33Z","lastTransitionTime":"2026-02-19T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.444523 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 06:22:48.181731419 +0000 UTC Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.483131 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:33 crc kubenswrapper[4873]: E0219 09:45:33.483280 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.483354 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.483445 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:33 crc kubenswrapper[4873]: E0219 09:45:33.483589 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:33 crc kubenswrapper[4873]: E0219 09:45:33.483774 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.532535 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.532602 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.532619 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.532642 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.532660 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:33Z","lastTransitionTime":"2026-02-19T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.584727 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs\") pod \"network-metrics-daemon-lcp8k\" (UID: \"98d35597-056d-48f0-b599-28b098dd45f3\") " pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:33 crc kubenswrapper[4873]: E0219 09:45:33.584959 4873 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 09:45:33 crc kubenswrapper[4873]: E0219 09:45:33.585089 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs podName:98d35597-056d-48f0-b599-28b098dd45f3 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:35.585059574 +0000 UTC m=+44.874491252 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs") pod "network-metrics-daemon-lcp8k" (UID: "98d35597-056d-48f0-b599-28b098dd45f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.635619 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.635682 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.635703 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.635728 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.635745 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:33Z","lastTransitionTime":"2026-02-19T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.739227 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.739313 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.739337 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.739367 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.739390 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:33Z","lastTransitionTime":"2026-02-19T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.842513 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.842556 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.842568 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.842586 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.842598 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:33Z","lastTransitionTime":"2026-02-19T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.945317 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.945389 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.945407 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.945430 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:33 crc kubenswrapper[4873]: I0219 09:45:33.945448 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:33Z","lastTransitionTime":"2026-02-19T09:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.048661 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.048727 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.048742 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.048760 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.048772 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:34Z","lastTransitionTime":"2026-02-19T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.152095 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.152162 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.152173 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.152194 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.152209 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:34Z","lastTransitionTime":"2026-02-19T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.255166 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.255294 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.255335 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.255369 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.255390 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:34Z","lastTransitionTime":"2026-02-19T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.358458 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.358538 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.358577 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.358609 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.358632 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:34Z","lastTransitionTime":"2026-02-19T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.445615 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 03:59:10.577655612 +0000 UTC Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.462359 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.462434 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.462454 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.462481 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.462498 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:34Z","lastTransitionTime":"2026-02-19T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.483722 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:34 crc kubenswrapper[4873]: E0219 09:45:34.483886 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.565046 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.565134 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.565154 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.565180 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.565197 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:34Z","lastTransitionTime":"2026-02-19T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.668647 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.668732 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.668749 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.668776 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.668793 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:34Z","lastTransitionTime":"2026-02-19T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.770867 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.770915 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.770925 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.770939 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.770950 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:34Z","lastTransitionTime":"2026-02-19T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.872811 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.872868 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.872886 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.872912 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.872928 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:34Z","lastTransitionTime":"2026-02-19T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.975286 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.975347 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.975360 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.975377 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:34 crc kubenswrapper[4873]: I0219 09:45:34.975390 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:34Z","lastTransitionTime":"2026-02-19T09:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.077551 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.077593 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.077602 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.077616 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.077625 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:35Z","lastTransitionTime":"2026-02-19T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.216451 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.216508 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.216524 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.216545 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.216559 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:35Z","lastTransitionTime":"2026-02-19T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.319903 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.320020 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.320047 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.320071 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.320087 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:35Z","lastTransitionTime":"2026-02-19T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.423400 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.423475 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.423502 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.423530 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.423553 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:35Z","lastTransitionTime":"2026-02-19T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.446693 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 19:00:13.837380047 +0000 UTC Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.484164 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.484250 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.484269 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:35 crc kubenswrapper[4873]: E0219 09:45:35.484358 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:45:35 crc kubenswrapper[4873]: E0219 09:45:35.484539 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:35 crc kubenswrapper[4873]: E0219 09:45:35.484699 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.526952 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.527046 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.527082 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.527151 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.527183 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:35Z","lastTransitionTime":"2026-02-19T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.608314 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs\") pod \"network-metrics-daemon-lcp8k\" (UID: \"98d35597-056d-48f0-b599-28b098dd45f3\") " pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:35 crc kubenswrapper[4873]: E0219 09:45:35.608720 4873 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 09:45:35 crc kubenswrapper[4873]: E0219 09:45:35.608847 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs podName:98d35597-056d-48f0-b599-28b098dd45f3 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:39.608818073 +0000 UTC m=+48.898249751 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs") pod "network-metrics-daemon-lcp8k" (UID: "98d35597-056d-48f0-b599-28b098dd45f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.630656 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.630727 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.630743 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.630766 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.630784 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:35Z","lastTransitionTime":"2026-02-19T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.733654 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.733724 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.733748 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.733775 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.733793 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:35Z","lastTransitionTime":"2026-02-19T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.837331 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.837440 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.837459 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.837487 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.837510 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:35Z","lastTransitionTime":"2026-02-19T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.940824 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.940909 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.940931 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.940962 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:35 crc kubenswrapper[4873]: I0219 09:45:35.940978 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:35Z","lastTransitionTime":"2026-02-19T09:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.044183 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.044264 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.044284 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.044311 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.044333 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:36Z","lastTransitionTime":"2026-02-19T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.147717 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.147790 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.147808 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.147846 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.147865 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:36Z","lastTransitionTime":"2026-02-19T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.251089 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.251183 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.251202 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.251226 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.251247 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:36Z","lastTransitionTime":"2026-02-19T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.354313 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.354385 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.354406 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.354434 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.354452 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:36Z","lastTransitionTime":"2026-02-19T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.447279 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 09:06:26.111353196 +0000 UTC Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.458078 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.458198 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.458220 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.458246 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.458266 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:36Z","lastTransitionTime":"2026-02-19T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.483754 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:36 crc kubenswrapper[4873]: E0219 09:45:36.483933 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.561842 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.561908 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.561928 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.561953 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.561971 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:36Z","lastTransitionTime":"2026-02-19T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.665497 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.665578 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.665602 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.665632 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.665656 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:36Z","lastTransitionTime":"2026-02-19T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.769721 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.769780 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.769797 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.769822 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.769839 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:36Z","lastTransitionTime":"2026-02-19T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.872668 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.872738 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.872757 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.872781 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.872800 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:36Z","lastTransitionTime":"2026-02-19T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.975551 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.975602 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.975621 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.975644 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:36 crc kubenswrapper[4873]: I0219 09:45:36.975662 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:36Z","lastTransitionTime":"2026-02-19T09:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.078349 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.078407 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.078425 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.078448 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.078467 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:37Z","lastTransitionTime":"2026-02-19T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.181501 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.181573 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.181606 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.181654 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.181675 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:37Z","lastTransitionTime":"2026-02-19T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.284875 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.284954 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.284978 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.285009 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.285031 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:37Z","lastTransitionTime":"2026-02-19T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.388081 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.388196 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.388215 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.388241 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.388266 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:37Z","lastTransitionTime":"2026-02-19T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.448658 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 16:37:00.755689598 +0000 UTC Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.483529 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.483571 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:37 crc kubenswrapper[4873]: E0219 09:45:37.483720 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.483759 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:37 crc kubenswrapper[4873]: E0219 09:45:37.483897 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:37 crc kubenswrapper[4873]: E0219 09:45:37.483991 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.490860 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.490912 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.490932 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.490957 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.490974 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:37Z","lastTransitionTime":"2026-02-19T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.594633 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.594689 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.594702 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.594720 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.594733 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:37Z","lastTransitionTime":"2026-02-19T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.620930 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.620999 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.621023 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.621050 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.621068 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:37Z","lastTransitionTime":"2026-02-19T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:37 crc kubenswrapper[4873]: E0219 09:45:37.641337 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:37Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.646463 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.646527 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.646552 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.646584 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.646612 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:37Z","lastTransitionTime":"2026-02-19T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:37 crc kubenswrapper[4873]: E0219 09:45:37.668813 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:37Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.672840 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.672900 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.672923 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.672954 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.672971 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:37Z","lastTransitionTime":"2026-02-19T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:37 crc kubenswrapper[4873]: E0219 09:45:37.696713 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:37Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.701811 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.701855 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.701869 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.701890 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.701906 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:37Z","lastTransitionTime":"2026-02-19T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:37 crc kubenswrapper[4873]: E0219 09:45:37.719640 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:37Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.724273 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.724338 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.724353 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.724377 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.724393 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:37Z","lastTransitionTime":"2026-02-19T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:37 crc kubenswrapper[4873]: E0219 09:45:37.744156 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:37Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:37 crc kubenswrapper[4873]: E0219 09:45:37.744304 4873 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.746491 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.746529 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.746543 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.746564 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.746580 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:37Z","lastTransitionTime":"2026-02-19T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.851745 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.851801 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.851815 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.851833 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.851845 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:37Z","lastTransitionTime":"2026-02-19T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.955225 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.955278 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.955290 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.955309 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:37 crc kubenswrapper[4873]: I0219 09:45:37.955322 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:37Z","lastTransitionTime":"2026-02-19T09:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.057985 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.058070 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.058141 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.058176 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.058203 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:38Z","lastTransitionTime":"2026-02-19T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.160869 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.160923 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.160941 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.160965 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.160982 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:38Z","lastTransitionTime":"2026-02-19T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.263487 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.263556 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.263582 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.263612 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.263634 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:38Z","lastTransitionTime":"2026-02-19T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.367003 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.367071 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.367089 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.367145 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.367164 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:38Z","lastTransitionTime":"2026-02-19T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.449537 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 17:03:14.124466165 +0000 UTC Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.469832 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.469928 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.469952 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.469982 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.470006 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:38Z","lastTransitionTime":"2026-02-19T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.483394 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:38 crc kubenswrapper[4873]: E0219 09:45:38.483548 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.573408 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.573454 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.573467 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.573483 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.573495 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:38Z","lastTransitionTime":"2026-02-19T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.675749 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.675817 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.675836 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.675865 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.675887 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:38Z","lastTransitionTime":"2026-02-19T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.779050 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.779094 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.779140 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.779162 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.779179 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:38Z","lastTransitionTime":"2026-02-19T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.882197 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.882255 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.882268 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.882287 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.882299 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:38Z","lastTransitionTime":"2026-02-19T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.985212 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.985258 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.985273 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.985293 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:38 crc kubenswrapper[4873]: I0219 09:45:38.985305 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:38Z","lastTransitionTime":"2026-02-19T09:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.087763 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.087811 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.087823 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.087840 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.087853 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:39Z","lastTransitionTime":"2026-02-19T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.190853 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.190900 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.190913 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.190936 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.190949 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:39Z","lastTransitionTime":"2026-02-19T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.293376 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.293438 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.293456 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.293480 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.293498 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:39Z","lastTransitionTime":"2026-02-19T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.397686 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.397742 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.397760 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.397791 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.397807 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:39Z","lastTransitionTime":"2026-02-19T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.450743 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 12:04:27.206997695 +0000 UTC Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.483619 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.483660 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:39 crc kubenswrapper[4873]: E0219 09:45:39.483845 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.483909 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:39 crc kubenswrapper[4873]: E0219 09:45:39.483980 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:39 crc kubenswrapper[4873]: E0219 09:45:39.484230 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.500676 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.500732 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.500753 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.500778 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.500797 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:39Z","lastTransitionTime":"2026-02-19T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.603472 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.603554 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.603572 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.603596 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.603615 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:39Z","lastTransitionTime":"2026-02-19T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.657731 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs\") pod \"network-metrics-daemon-lcp8k\" (UID: \"98d35597-056d-48f0-b599-28b098dd45f3\") " pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:39 crc kubenswrapper[4873]: E0219 09:45:39.657936 4873 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 09:45:39 crc kubenswrapper[4873]: E0219 09:45:39.658068 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs podName:98d35597-056d-48f0-b599-28b098dd45f3 nodeName:}" failed. No retries permitted until 2026-02-19 09:45:47.658038444 +0000 UTC m=+56.947470122 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs") pod "network-metrics-daemon-lcp8k" (UID: "98d35597-056d-48f0-b599-28b098dd45f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.707076 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.707184 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.707204 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.707228 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.707246 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:39Z","lastTransitionTime":"2026-02-19T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.811808 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.811894 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.811914 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.811938 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.812016 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:39Z","lastTransitionTime":"2026-02-19T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.915776 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.915825 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.915837 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.915854 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:39 crc kubenswrapper[4873]: I0219 09:45:39.915867 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:39Z","lastTransitionTime":"2026-02-19T09:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.018666 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.018744 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.018764 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.018791 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.018808 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:40Z","lastTransitionTime":"2026-02-19T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.120820 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.120886 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.120905 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.120930 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.120947 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:40Z","lastTransitionTime":"2026-02-19T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.223736 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.223808 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.223826 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.223850 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.223868 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:40Z","lastTransitionTime":"2026-02-19T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.327087 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.327171 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.327191 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.327214 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.327232 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:40Z","lastTransitionTime":"2026-02-19T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.430316 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.430460 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.430480 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.430504 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.430524 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:40Z","lastTransitionTime":"2026-02-19T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.451667 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 13:41:44.285565054 +0000 UTC Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.483444 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:40 crc kubenswrapper[4873]: E0219 09:45:40.483637 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.534403 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.534470 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.534490 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.534515 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.534534 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:40Z","lastTransitionTime":"2026-02-19T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.637561 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.637640 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.637663 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.637690 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.637710 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:40Z","lastTransitionTime":"2026-02-19T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.740975 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.741050 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.741070 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.741095 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.741144 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:40Z","lastTransitionTime":"2026-02-19T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.844405 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.844468 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.844487 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.844511 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.844529 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:40Z","lastTransitionTime":"2026-02-19T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.947559 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.947635 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.947657 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.947684 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:40 crc kubenswrapper[4873]: I0219 09:45:40.947705 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:40Z","lastTransitionTime":"2026-02-19T09:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.050369 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.050420 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.050433 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.050451 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.050463 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:41Z","lastTransitionTime":"2026-02-19T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.153608 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.153671 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.153689 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.153714 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.153733 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:41Z","lastTransitionTime":"2026-02-19T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.256523 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.256579 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.256634 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.256659 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.256676 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:41Z","lastTransitionTime":"2026-02-19T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.360825 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.361259 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.361503 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.361723 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.361909 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:41Z","lastTransitionTime":"2026-02-19T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.452659 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 09:47:29.881706536 +0000 UTC Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.466030 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.466097 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.466148 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.466182 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.466217 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:41Z","lastTransitionTime":"2026-02-19T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.483417 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.483560 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:41 crc kubenswrapper[4873]: E0219 09:45:41.483593 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.483704 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:41 crc kubenswrapper[4873]: E0219 09:45:41.484087 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:41 crc kubenswrapper[4873]: E0219 09:45:41.484323 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.505139 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.530417 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.552239 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc 
kubenswrapper[4873]: I0219 09:45:41.569276 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.569331 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.569347 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.569368 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.569383 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:41Z","lastTransitionTime":"2026-02-19T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.575944 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d457f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.596918 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.617659 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.640419 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.666004 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa01
3a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:
45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.672433 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.672521 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.672545 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.672600 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.672623 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:41Z","lastTransitionTime":"2026-02-19T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.680320 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.707449 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.732432 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.755958 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.776225 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.776320 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.776338 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.776428 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.776468 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:41Z","lastTransitionTime":"2026-02-19T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.791268 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"message\\\":\\\"311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:30.029768 6333 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0219 09:45:30.032587 6333 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 09:45:30.032616 6333 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 09:45:30.032670 6333 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 09:45:30.032733 6333 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 09:45:30.032745 6333 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:30.032824 6333 factory.go:656] Stopping watch factory\\\\nI0219 09:45:30.032844 6333 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:30.032881 6333 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:30.032897 6333 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:30.032908 6333 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:30.032931 6333 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c428
5fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.808517 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.833924 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.855390 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:41Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.879971 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.880017 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.880034 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:41 crc 
kubenswrapper[4873]: I0219 09:45:41.880061 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.880079 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:41Z","lastTransitionTime":"2026-02-19T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.982942 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.983036 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.983060 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.983092 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:41 crc kubenswrapper[4873]: I0219 09:45:41.983188 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:41Z","lastTransitionTime":"2026-02-19T09:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.086308 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.086364 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.086382 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.086409 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.086431 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:42Z","lastTransitionTime":"2026-02-19T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.189151 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.189234 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.189257 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.189280 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.189299 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:42Z","lastTransitionTime":"2026-02-19T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.292631 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.292675 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.292687 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.292704 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.292716 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:42Z","lastTransitionTime":"2026-02-19T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.395684 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.395747 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.395763 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.395784 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.395799 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:42Z","lastTransitionTime":"2026-02-19T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.453845 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 03:53:47.266382958 +0000 UTC
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.483226 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 09:45:42 crc kubenswrapper[4873]: E0219 09:45:42.483601 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.498911 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.499041 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.499062 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.499087 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.499132 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:42Z","lastTransitionTime":"2026-02-19T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.601900 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.601958 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.601977 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.602002 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.602020 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:42Z","lastTransitionTime":"2026-02-19T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.705242 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.705308 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.705322 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.705338 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.705348 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:42Z","lastTransitionTime":"2026-02-19T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.808474 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.808514 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.808525 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.808543 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.808556 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:42Z","lastTransitionTime":"2026-02-19T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.910343 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.910387 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.910400 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.910416 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:42 crc kubenswrapper[4873]: I0219 09:45:42.910428 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:42Z","lastTransitionTime":"2026-02-19T09:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.013842 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.013927 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.013952 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.013981 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.014003 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:43Z","lastTransitionTime":"2026-02-19T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.117927 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.117993 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.118010 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.118033 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.118050 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:43Z","lastTransitionTime":"2026-02-19T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.218060 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.218233 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.218284 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:46:15.218246506 +0000 UTC m=+84.507678184 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.218339 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.218363 4873 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.218402 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.218422 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:46:15.2184055 +0000 UTC m=+84.507837168 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.218481 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.218649 4873 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.218711 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.218735 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.218754 4873 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.218719 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:46:15.218698207 +0000 UTC m=+84.508129885 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.218813 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 09:46:15.21879855 +0000 UTC m=+84.508230218 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.218899 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.218916 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.218931 4873 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.218975 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 09:46:15.218962294 +0000 UTC m=+84.508393972 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.222588 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.222638 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.222651 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.222670 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.222685 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:43Z","lastTransitionTime":"2026-02-19T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.325046 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.325129 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.325148 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.325171 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.325187 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:43Z","lastTransitionTime":"2026-02-19T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.440296 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.440358 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.440376 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.440399 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.440416 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:43Z","lastTransitionTime":"2026-02-19T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.455055 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 23:33:14.237046189 +0000 UTC
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.483550 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.483698 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.484052 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.484327 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.484239 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3"
Feb 19 09:45:43 crc kubenswrapper[4873]: E0219 09:45:43.484689 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.544031 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.544418 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.544583 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.544722 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.544860 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:43Z","lastTransitionTime":"2026-02-19T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.649366 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.649440 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.649467 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.649516 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.649562 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:43Z","lastTransitionTime":"2026-02-19T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.753627 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.753742 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.753763 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.753788 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.753805 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:43Z","lastTransitionTime":"2026-02-19T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.857291 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.857396 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.857415 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.857439 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.857460 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:43Z","lastTransitionTime":"2026-02-19T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.961045 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.961152 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.961170 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.961190 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:43 crc kubenswrapper[4873]: I0219 09:45:43.961204 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:43Z","lastTransitionTime":"2026-02-19T09:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.064510 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.064567 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.064585 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.064649 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.064669 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:44Z","lastTransitionTime":"2026-02-19T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.167617 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.167700 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.167728 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.167761 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.167786 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:44Z","lastTransitionTime":"2026-02-19T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.270477 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.270515 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.270525 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.270538 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.270547 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:44Z","lastTransitionTime":"2026-02-19T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.373451 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.373494 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.373506 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.373523 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.373536 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:44Z","lastTransitionTime":"2026-02-19T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.455992 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 21:20:26.276547555 +0000 UTC Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.476582 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.476667 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.476688 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.476712 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.476778 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:44Z","lastTransitionTime":"2026-02-19T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.484302 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:44 crc kubenswrapper[4873]: E0219 09:45:44.484499 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.580814 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.580879 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.580919 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.580956 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.580978 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:44Z","lastTransitionTime":"2026-02-19T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.683868 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.683940 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.683965 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.684002 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.684052 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:44Z","lastTransitionTime":"2026-02-19T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.786967 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.787050 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.787070 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.787096 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.787152 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:44Z","lastTransitionTime":"2026-02-19T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.890494 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.890550 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.890572 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.890646 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.890668 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:44Z","lastTransitionTime":"2026-02-19T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.993811 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.993876 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.993905 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.993937 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:44 crc kubenswrapper[4873]: I0219 09:45:44.993961 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:44Z","lastTransitionTime":"2026-02-19T09:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.096572 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.096636 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.096660 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.096707 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.096726 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:45Z","lastTransitionTime":"2026-02-19T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.199370 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.199440 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.199459 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.199485 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.199507 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:45Z","lastTransitionTime":"2026-02-19T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.302933 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.303062 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.303086 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.303134 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.303158 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:45Z","lastTransitionTime":"2026-02-19T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.406573 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.406644 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.406666 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.406690 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.406710 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:45Z","lastTransitionTime":"2026-02-19T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.456383 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 09:06:08.874490594 +0000 UTC Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.483464 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.483541 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.483501 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:45 crc kubenswrapper[4873]: E0219 09:45:45.483742 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:45 crc kubenswrapper[4873]: E0219 09:45:45.483813 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:45 crc kubenswrapper[4873]: E0219 09:45:45.483887 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.486070 4873 scope.go:117] "RemoveContainer" containerID="37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.509980 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.510039 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.510056 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.510079 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.510097 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:45Z","lastTransitionTime":"2026-02-19T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.613060 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.613819 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.613960 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.614090 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.614254 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:45Z","lastTransitionTime":"2026-02-19T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.717162 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.717212 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.717228 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.717251 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.717265 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:45Z","lastTransitionTime":"2026-02-19T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.820398 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.820464 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.820482 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.820504 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.820522 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:45Z","lastTransitionTime":"2026-02-19T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.843382 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/1.log" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.847146 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerStarted","Data":"5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91"} Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.847530 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.872997 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:45Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.895426 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:45Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.919722 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:45Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.924743 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.924789 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.924807 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.924831 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.924849 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:45Z","lastTransitionTime":"2026-02-19T09:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.957345 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"message\\\":\\\"311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:30.029768 6333 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0219 09:45:30.032587 6333 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 09:45:30.032616 6333 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 09:45:30.032670 6333 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 09:45:30.032733 6333 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 09:45:30.032745 6333 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:30.032824 6333 factory.go:656] Stopping watch factory\\\\nI0219 09:45:30.032844 6333 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:30.032881 6333 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:30.032897 6333 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:30.032908 6333 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:30.032931 6333 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:45Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.975311 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:45Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:45 crc kubenswrapper[4873]: I0219 09:45:45.998612 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:45Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.018220 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.027474 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.027534 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.027552 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:46 crc 
kubenswrapper[4873]: I0219 09:45:46.027574 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.027587 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:46Z","lastTransitionTime":"2026-02-19T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.034802 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.048080 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.060277 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.071520 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc 
kubenswrapper[4873]: I0219 09:45:46.084895 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.092937 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.103858 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.105341 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.121599 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.129035 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:46 crc 
kubenswrapper[4873]: I0219 09:45:46.129061 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.129070 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.129084 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.129094 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:46Z","lastTransitionTime":"2026-02-19T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.135881 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592
389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.148346 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d457f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.161555 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.174502 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.190520 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa01
3a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:
45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.203465 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d457f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.218247 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-con
troller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.229835 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.231536 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.231566 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.231577 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.231593 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.231604 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:46Z","lastTransitionTime":"2026-02-19T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.243469 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.268185 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"message\\\":\\\"311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:30.029768 6333 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0219 09:45:30.032587 6333 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 09:45:30.032616 6333 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 09:45:30.032670 6333 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 09:45:30.032733 6333 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 09:45:30.032745 6333 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:30.032824 6333 factory.go:656] Stopping watch factory\\\\nI0219 09:45:30.032844 6333 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:30.032881 6333 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:30.032897 6333 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:30.032908 6333 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:30.032931 6333 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.279561 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.292496 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.306952 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39a3f6b0-3581-4f36-85f6-24fa414b73f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15317bf5117d3cfd841bcc7a4475aeafb262813b4aa756a3b18eb4ccb598da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dbf05928c62c4c7f722f2d1fbd6a2c7bfc3c1306c17dcc32780239860cc8220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4e9759da0e3aeedb86b3d7c95d090eb2326b9febb35ebd7dbe7f78abc2677c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-
19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.322431 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.333806 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.333835 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.333843 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.333858 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.333869 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:46Z","lastTransitionTime":"2026-02-19T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.334763 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.349457 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.361366 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.377548 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.391786 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc 
kubenswrapper[4873]: I0219 09:45:46.436769 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.436806 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.436817 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.436832 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.436841 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:46Z","lastTransitionTime":"2026-02-19T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.456957 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 22:14:30.156654077 +0000 UTC Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.483730 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:46 crc kubenswrapper[4873]: E0219 09:45:46.483879 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.539713 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.539753 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.539780 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.539796 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.539806 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:46Z","lastTransitionTime":"2026-02-19T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.643178 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.643245 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.643266 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.643292 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.643311 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:46Z","lastTransitionTime":"2026-02-19T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.746185 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.746255 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.746281 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.746312 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.746334 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:46Z","lastTransitionTime":"2026-02-19T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.849541 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.849618 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.849639 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.849671 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.849691 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:46Z","lastTransitionTime":"2026-02-19T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.852086 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/2.log" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.853145 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/1.log" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.857320 4873 generic.go:334] "Generic (PLEG): container finished" podID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerID="5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91" exitCode=1 Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.857438 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerDied","Data":"5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91"} Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.857508 4873 scope.go:117] "RemoveContainer" containerID="37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.859276 4873 scope.go:117] "RemoveContainer" containerID="5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91" Feb 19 09:45:46 crc kubenswrapper[4873]: E0219 09:45:46.863375 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.884312 4873 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.897252 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.912366 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39a3f6b0-3581-4f36-85f6-24fa414b73f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15317bf5117d3cfd841bcc7a4475aeafb262813b4aa756a3b18eb4ccb598da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dbf05928c62c4c7f722f2d1fbd6a2c7bfc3c1306c17dcc32780239860cc8220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4e9759da0e3aeedb86b3d7c95d090eb2326b9febb35ebd7dbe7f78abc2677c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.933570 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.946334 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc 
kubenswrapper[4873]: I0219 09:45:46.953717 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.953784 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.953818 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.953847 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.953868 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:46Z","lastTransitionTime":"2026-02-19T09:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.959465 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.972062 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:46 crc kubenswrapper[4873]: I0219 09:45:46.990547 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592
389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:46Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.005654 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d457f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.022569 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.037978 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.052928 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.057142 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:47 crc 
kubenswrapper[4873]: I0219 09:45:47.057213 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.057238 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.057271 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.057296 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:47Z","lastTransitionTime":"2026-02-19T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.084703 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37d6b1798b9f98114e3e6cea724b615c9aa0a796cdecbed6e2693d573e20c0e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"message\\\":\\\"311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:30.029768 6333 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0219 09:45:30.032587 6333 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 09:45:30.032616 6333 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 09:45:30.032670 6333 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 09:45:30.032733 6333 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 09:45:30.032745 6333 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:30.032824 6333 factory.go:656] Stopping watch factory\\\\nI0219 09:45:30.032844 6333 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:30.032881 6333 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:30.032897 6333 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:30.032908 6333 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:30.032931 6333 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"message\\\":\\\"mespace event handler 1 for removal\\\\nI0219 09:45:46.434884 6553 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:46.434869 6553 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:46.434907 6553 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:46.434918 6553 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 09:45:46.434925 6553 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:46.434927 6553 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09:45:46.434932 6553 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 09:45:46.434970 6553 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 09:45:46.434920 6553 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:46.434985 6553 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 09:45:46.434982 6553 factory.go:656] Stopping watch factory\\\\nI0219 09:45:46.434994 6553 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 09:45:46.436159 6553 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:46.436204 6553 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 
09:45:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a0280454
27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.102944 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.123756 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.141478 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.159185 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.159608 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.159647 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.159662 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.159687 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.159702 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:47Z","lastTransitionTime":"2026-02-19T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.262125 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.262201 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.262234 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.262251 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.262263 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:47Z","lastTransitionTime":"2026-02-19T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.365147 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.365214 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.365234 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.365261 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.365278 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:47Z","lastTransitionTime":"2026-02-19T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.457876 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 00:08:11.578741529 +0000 UTC Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.468381 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.468499 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.468519 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.468543 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.468562 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:47Z","lastTransitionTime":"2026-02-19T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.483489 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.483563 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:47 crc kubenswrapper[4873]: E0219 09:45:47.483712 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.483771 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:47 crc kubenswrapper[4873]: E0219 09:45:47.483933 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:47 crc kubenswrapper[4873]: E0219 09:45:47.484084 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.571725 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.571805 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.571831 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.571863 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.571885 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:47Z","lastTransitionTime":"2026-02-19T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.674161 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.674191 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.674201 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.674216 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.674230 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:47Z","lastTransitionTime":"2026-02-19T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.684682 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs\") pod \"network-metrics-daemon-lcp8k\" (UID: \"98d35597-056d-48f0-b599-28b098dd45f3\") " pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:47 crc kubenswrapper[4873]: E0219 09:45:47.684832 4873 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 09:45:47 crc kubenswrapper[4873]: E0219 09:45:47.684892 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs podName:98d35597-056d-48f0-b599-28b098dd45f3 nodeName:}" failed. No retries permitted until 2026-02-19 09:46:03.68487704 +0000 UTC m=+72.974308688 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs") pod "network-metrics-daemon-lcp8k" (UID: "98d35597-056d-48f0-b599-28b098dd45f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.778037 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.778197 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.778219 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.778250 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.778281 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:47Z","lastTransitionTime":"2026-02-19T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.862939 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/2.log" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.867357 4873 scope.go:117] "RemoveContainer" containerID="5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91" Feb 19 09:45:47 crc kubenswrapper[4873]: E0219 09:45:47.867500 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.881754 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.881782 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.881794 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.881807 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.881819 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:47Z","lastTransitionTime":"2026-02-19T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.885381 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc 
kubenswrapper[4873]: I0219 09:45:47.902448 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.918528 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.941391 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592
389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.959082 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d457f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.980581 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.985378 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.985432 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.985444 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.985462 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:47 crc kubenswrapper[4873]: I0219 09:45:47.985477 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:47Z","lastTransitionTime":"2026-02-19T09:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.001534 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.024387 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.041792 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 
09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.058168 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"message\\\":\\\"mespace event handler 1 for removal\\\\nI0219 09:45:46.434884 6553 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:46.434869 6553 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:46.434907 6553 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:46.434918 6553 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 09:45:46.434925 6553 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:46.434927 6553 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09:45:46.434932 6553 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 09:45:46.434970 6553 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 09:45:46.434920 6553 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:46.434985 6553 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 09:45:46.434982 6553 factory.go:656] Stopping watch factory\\\\nI0219 09:45:46.434994 6553 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 09:45:46.436159 6553 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:46.436204 6553 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 09:45:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c428
5fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.075881 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.088494 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.088577 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.088596 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.088631 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.088654 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:48Z","lastTransitionTime":"2026-02-19T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.098265 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.106059 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.106159 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.106188 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.106222 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.106247 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:48Z","lastTransitionTime":"2026-02-19T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.117864 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: E0219 09:45:48.128733 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.133184 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.133218 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.133228 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.133246 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.133260 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:48Z","lastTransitionTime":"2026-02-19T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.133878 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed9
7590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: E0219 09:45:48.151294 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z"
Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.153543 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z"
Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.156733 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.156783 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.156800 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:48 crc 
kubenswrapper[4873]: I0219 09:45:48.156820 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.156836 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:48Z","lastTransitionTime":"2026-02-19T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.169068 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z"
Feb 19 09:45:48 crc kubenswrapper[4873]: E0219 09:45:48.178327 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z"
Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.182787 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.182825 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.182838 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.182858 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.182872 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:48Z","lastTransitionTime":"2026-02-19T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.189005 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39a3f6b0-3581-4f36-85f6-24fa414b73f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15317bf5117d3cfd841bcc7a4475aeafb262813b4aa756a3b18eb4ccb598da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dbf05928c62c4c7f722f2d1fbd6a2
c7bfc3c1306c17dcc32780239860cc8220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4e9759da0e3aeedb86b3d7c95d090eb2326b9febb35ebd7dbe7f78abc2677c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z"
Feb 19 09:45:48 crc kubenswrapper[4873]: E0219 09:45:48.204332 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.207000 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.209052 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.209132 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.209152 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.209176 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.209195 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:48Z","lastTransitionTime":"2026-02-19T09:45:48Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:48 crc kubenswrapper[4873]: E0219 09:45:48.244481 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370
996d8eea\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: E0219 09:45:48.244897 4873 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.247350 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.247414 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.247438 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.247467 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.247488 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:48Z","lastTransitionTime":"2026-02-19T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.264417 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.286807 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.299532 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc 
kubenswrapper[4873]: I0219 09:45:48.313552 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.327482 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.341152 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.349517 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:48 crc 
kubenswrapper[4873]: I0219 09:45:48.349558 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.349570 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.349588 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.349599 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:48Z","lastTransitionTime":"2026-02-19T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.365059 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592
389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.380235 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d457f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.397346 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.411047 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.424418 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc1
5eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.444906 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"message\\\":\\\"mespace event handler 1 for removal\\\\nI0219 09:45:46.434884 6553 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:46.434869 6553 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:46.434907 6553 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:46.434918 6553 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 09:45:46.434925 6553 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:46.434927 6553 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09:45:46.434932 6553 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 09:45:46.434970 6553 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 09:45:46.434920 6553 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:46.434985 6553 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 09:45:46.434982 6553 factory.go:656] Stopping watch factory\\\\nI0219 09:45:46.434994 6553 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 09:45:46.436159 6553 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:46.436204 6553 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 09:45:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c428
5fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.451967 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.452005 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.452017 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.452034 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.452047 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:48Z","lastTransitionTime":"2026-02-19T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.456749 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.458800 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 13:00:21.103018127 +0000 UTC Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.468632 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39a3f6b0-3581-4f36-85f6-24fa414b73f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15317bf5117d3cfd841bcc7a4475aeafb262813b4aa756a3b18eb4ccb598da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dbf05928c62c4c7f722f2d1fbd6a2c7bfc3c1306c17dcc32780239860cc8220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4e9759da0e3aeedb86b3d7c95d090eb2326b9febb35ebd7dbe7f78abc2677c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.483284 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.483277 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: E0219 09:45:48.483458 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.502276 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.513820 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.555475 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.555799 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.555940 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.556077 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.556280 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:48Z","lastTransitionTime":"2026-02-19T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.659609 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.659673 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.659691 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.659715 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.659734 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:48Z","lastTransitionTime":"2026-02-19T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.784864 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.784927 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.784950 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.784978 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.784999 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:48Z","lastTransitionTime":"2026-02-19T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.887174 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.887217 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.887229 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.887245 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.887257 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:48Z","lastTransitionTime":"2026-02-19T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.990238 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.990280 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.990295 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.990313 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:48 crc kubenswrapper[4873]: I0219 09:45:48.990327 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:48Z","lastTransitionTime":"2026-02-19T09:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.094461 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.094537 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.094564 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.094597 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.094731 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:49Z","lastTransitionTime":"2026-02-19T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.197762 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.197827 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.197847 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.197874 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.197894 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:49Z","lastTransitionTime":"2026-02-19T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.301934 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.301991 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.302006 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.302025 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.302039 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:49Z","lastTransitionTime":"2026-02-19T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.405167 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.405243 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.405263 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.405288 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.405310 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:49Z","lastTransitionTime":"2026-02-19T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.459386 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 13:04:58.807351495 +0000 UTC Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.484474 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.484586 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:49 crc kubenswrapper[4873]: E0219 09:45:49.484676 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:49 crc kubenswrapper[4873]: E0219 09:45:49.484844 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.484483 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:49 crc kubenswrapper[4873]: E0219 09:45:49.485026 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.507858 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.507905 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.507922 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.507944 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.507962 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:49Z","lastTransitionTime":"2026-02-19T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.611300 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.611365 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.611388 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.611418 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.611441 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:49Z","lastTransitionTime":"2026-02-19T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.714662 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.714699 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.714712 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.714728 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.714738 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:49Z","lastTransitionTime":"2026-02-19T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.817556 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.817629 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.817656 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.817734 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.817761 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:49Z","lastTransitionTime":"2026-02-19T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.921244 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.921313 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.921331 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.921357 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:49 crc kubenswrapper[4873]: I0219 09:45:49.921375 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:49Z","lastTransitionTime":"2026-02-19T09:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.024270 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.024340 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.024357 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.024382 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.024398 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:50Z","lastTransitionTime":"2026-02-19T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.127034 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.127074 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.127086 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.127127 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.127150 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:50Z","lastTransitionTime":"2026-02-19T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.229784 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.229846 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.229865 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.229891 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.229951 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:50Z","lastTransitionTime":"2026-02-19T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.333478 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.333539 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.333557 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.333578 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.333592 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:50Z","lastTransitionTime":"2026-02-19T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.435802 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.435834 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.435846 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.435860 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.435872 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:50Z","lastTransitionTime":"2026-02-19T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.459602 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 21:02:34.816472488 +0000 UTC
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.484041 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 09:45:50 crc kubenswrapper[4873]: E0219 09:45:50.484280 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.539311 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.539366 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.539387 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.539411 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.539429 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:50Z","lastTransitionTime":"2026-02-19T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.642630 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.642712 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.642734 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.642762 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.642782 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:50Z","lastTransitionTime":"2026-02-19T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.745341 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.745397 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.745414 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.745437 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.745453 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:50Z","lastTransitionTime":"2026-02-19T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.848277 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.848323 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.848339 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.848361 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.848378 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:50Z","lastTransitionTime":"2026-02-19T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.951781 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.951847 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.951864 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.951886 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:50 crc kubenswrapper[4873]: I0219 09:45:50.951903 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:50Z","lastTransitionTime":"2026-02-19T09:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.054956 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.055010 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.055029 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.055051 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.055067 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:51Z","lastTransitionTime":"2026-02-19T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.157541 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.157603 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.157623 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.157652 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.157676 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:51Z","lastTransitionTime":"2026-02-19T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.260133 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.260173 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.260213 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.260232 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.260272 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:51Z","lastTransitionTime":"2026-02-19T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.368647 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.369221 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.369417 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.369630 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.369797 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:51Z","lastTransitionTime":"2026-02-19T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.460685 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 22:20:18.086299302 +0000 UTC
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.473347 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.473398 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.473417 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.473439 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.473455 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:51Z","lastTransitionTime":"2026-02-19T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.483192 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.483410 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.483377 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 09:45:51 crc kubenswrapper[4873]: E0219 09:45:51.483822 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3"
Feb 19 09:45:51 crc kubenswrapper[4873]: E0219 09:45:51.484043 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 09:45:51 crc kubenswrapper[4873]: E0219 09:45:51.484217 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.500411 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc 
kubenswrapper[4873]: I0219 09:45:51.514127 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.530264 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.546208 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592
389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.560894 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d457f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.575857 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.575933 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.575958 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.575991 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.576012 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:51Z","lastTransitionTime":"2026-02-19T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.583211 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0
e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.607623 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.628234 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.657700 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"message\\\":\\\"mespace event handler 1 for removal\\\\nI0219 09:45:46.434884 6553 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:46.434869 6553 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:46.434907 6553 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:46.434918 6553 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 09:45:46.434925 6553 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:46.434927 6553 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09:45:46.434932 6553 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 09:45:46.434970 6553 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 09:45:46.434920 6553 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:46.434985 6553 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 09:45:46.434982 6553 factory.go:656] Stopping watch factory\\\\nI0219 09:45:46.434994 6553 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 09:45:46.436159 6553 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:46.436204 6553 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 09:45:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c428
5fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.674173 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.679810 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.680063 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.680261 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.680414 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.680560 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:51Z","lastTransitionTime":"2026-02-19T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.695512 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.712489 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.732077 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc1
5eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.750061 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.765173 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.782931 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39a3f6b0-3581-4f36-85f6-24fa414b73f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15317bf5117d3cfd841bcc7a4475aeafb262813b4aa756a3b18eb4ccb598da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dbf05928c62c4c7f722f2d1fbd6a2c7bfc3c1306c17dcc32780239860cc8220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4e9759da0e3aeedb86b3d7c95d090eb2326b9febb35ebd7dbe7f78abc2677c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.783451 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.783590 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.783711 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.783806 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.783901 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:51Z","lastTransitionTime":"2026-02-19T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.803153 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-ope
rator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:51Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.886096 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.886175 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.886194 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.886220 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.886253 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:51Z","lastTransitionTime":"2026-02-19T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.988590 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.988641 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.988658 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.988682 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:51 crc kubenswrapper[4873]: I0219 09:45:51.988699 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:51Z","lastTransitionTime":"2026-02-19T09:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.091965 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.092012 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.092025 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.092042 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.092054 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:52Z","lastTransitionTime":"2026-02-19T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.194907 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.195407 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.195571 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.195722 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.195888 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:52Z","lastTransitionTime":"2026-02-19T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.299786 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.299832 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.299849 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.299873 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.299890 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:52Z","lastTransitionTime":"2026-02-19T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.403069 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.403502 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.403662 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.403819 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.403985 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:52Z","lastTransitionTime":"2026-02-19T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.460874 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 22:30:26.455476064 +0000 UTC Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.483190 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:52 crc kubenswrapper[4873]: E0219 09:45:52.483382 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.507530 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.507572 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.507591 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.507612 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.507629 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:52Z","lastTransitionTime":"2026-02-19T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.611184 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.611253 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.611271 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.611293 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.611312 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:52Z","lastTransitionTime":"2026-02-19T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.714183 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.714237 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.714262 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.714288 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.714307 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:52Z","lastTransitionTime":"2026-02-19T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.817451 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.817516 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.817536 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.817561 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.817579 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:52Z","lastTransitionTime":"2026-02-19T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.920057 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.920498 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.920709 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.920891 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:52 crc kubenswrapper[4873]: I0219 09:45:52.921049 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:52Z","lastTransitionTime":"2026-02-19T09:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.024263 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.024328 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.024346 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.024369 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.024397 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:53Z","lastTransitionTime":"2026-02-19T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.127566 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.127614 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.127625 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.127647 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.127660 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:53Z","lastTransitionTime":"2026-02-19T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.230794 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.230850 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.230862 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.230879 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.230890 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:53Z","lastTransitionTime":"2026-02-19T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.335937 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.335992 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.336008 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.336045 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.336060 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:53Z","lastTransitionTime":"2026-02-19T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.439077 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.439170 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.439189 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.439212 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.439229 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:53Z","lastTransitionTime":"2026-02-19T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.463258 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 07:49:40.297600798 +0000 UTC Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.484389 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.484472 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.484520 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:53 crc kubenswrapper[4873]: E0219 09:45:53.484633 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:45:53 crc kubenswrapper[4873]: E0219 09:45:53.484757 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:53 crc kubenswrapper[4873]: E0219 09:45:53.484906 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.542489 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.542546 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.542565 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.542589 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.542606 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:53Z","lastTransitionTime":"2026-02-19T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.645810 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.645885 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.645909 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.645940 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.645962 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:53Z","lastTransitionTime":"2026-02-19T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.748602 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.748667 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.748685 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.748708 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.748726 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:53Z","lastTransitionTime":"2026-02-19T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.851390 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.851447 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.851465 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.851488 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.851506 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:53Z","lastTransitionTime":"2026-02-19T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.955060 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.955148 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.955167 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.955191 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:53 crc kubenswrapper[4873]: I0219 09:45:53.955208 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:53Z","lastTransitionTime":"2026-02-19T09:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.057907 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.057947 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.057959 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.057976 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.057987 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:54Z","lastTransitionTime":"2026-02-19T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.160181 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.160223 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.160237 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.160254 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.160264 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:54Z","lastTransitionTime":"2026-02-19T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.262133 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.262159 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.262167 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.262178 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.262187 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:54Z","lastTransitionTime":"2026-02-19T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.363512 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.363535 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.363542 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.363552 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.363559 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:54Z","lastTransitionTime":"2026-02-19T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.464432 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 00:05:16.706537834 +0000 UTC Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.466349 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.466403 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.466424 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.466450 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.466471 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:54Z","lastTransitionTime":"2026-02-19T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.484186 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:54 crc kubenswrapper[4873]: E0219 09:45:54.484434 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.569475 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.569542 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.569565 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.569592 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.569613 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:54Z","lastTransitionTime":"2026-02-19T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.673501 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.673646 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.673668 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.673693 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.673713 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:54Z","lastTransitionTime":"2026-02-19T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.776987 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.777050 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.777070 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.777129 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.777147 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:54Z","lastTransitionTime":"2026-02-19T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.879385 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.879430 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.879441 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.879459 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.879470 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:54Z","lastTransitionTime":"2026-02-19T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.981641 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.981702 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.981720 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.981744 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:54 crc kubenswrapper[4873]: I0219 09:45:54.981765 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:54Z","lastTransitionTime":"2026-02-19T09:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.084605 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.084653 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.084669 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.084689 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.084705 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:55Z","lastTransitionTime":"2026-02-19T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.187050 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.187197 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.187221 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.187243 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.187260 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:55Z","lastTransitionTime":"2026-02-19T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.289369 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.289414 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.289423 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.289436 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.289444 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:55Z","lastTransitionTime":"2026-02-19T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.391958 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.392033 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.392059 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.392088 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.392152 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:55Z","lastTransitionTime":"2026-02-19T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.493457 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 15:29:20.900766694 +0000 UTC Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.494595 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.494628 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:55 crc kubenswrapper[4873]: E0219 09:45:55.494796 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.494814 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:55 crc kubenswrapper[4873]: E0219 09:45:55.494872 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:55 crc kubenswrapper[4873]: E0219 09:45:55.494939 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.497014 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.497069 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.497090 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.497149 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.497173 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:55Z","lastTransitionTime":"2026-02-19T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.600181 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.600259 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.600284 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.600317 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.600342 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:55Z","lastTransitionTime":"2026-02-19T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.704032 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.704098 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.704131 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.704153 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.704176 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:55Z","lastTransitionTime":"2026-02-19T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.806916 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.806954 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.806963 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.806976 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.806985 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:55Z","lastTransitionTime":"2026-02-19T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.909099 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.909191 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.909214 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.909242 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:55 crc kubenswrapper[4873]: I0219 09:45:55.909264 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:55Z","lastTransitionTime":"2026-02-19T09:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.010973 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.011000 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.011008 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.011019 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.011028 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:56Z","lastTransitionTime":"2026-02-19T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.114340 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.114374 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.114386 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.114402 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.114416 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:56Z","lastTransitionTime":"2026-02-19T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.217979 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.218063 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.218084 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.218154 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.218193 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:56Z","lastTransitionTime":"2026-02-19T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.320928 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.321009 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.321033 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.321063 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.321084 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:56Z","lastTransitionTime":"2026-02-19T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.424514 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.424574 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.424591 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.424616 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.424634 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:56Z","lastTransitionTime":"2026-02-19T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.483573 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:56 crc kubenswrapper[4873]: E0219 09:45:56.483803 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.494549 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 21:54:46.192322339 +0000 UTC Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.526569 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.526601 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.526610 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.526623 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.526632 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:56Z","lastTransitionTime":"2026-02-19T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.629337 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.629392 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.629404 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.629419 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.629430 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:56Z","lastTransitionTime":"2026-02-19T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.731962 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.732027 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.732044 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.732070 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.732086 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:56Z","lastTransitionTime":"2026-02-19T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.834626 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.834694 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.834710 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.834731 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.834747 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:56Z","lastTransitionTime":"2026-02-19T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.937848 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.937907 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.937928 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.937954 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:56 crc kubenswrapper[4873]: I0219 09:45:56.937970 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:56Z","lastTransitionTime":"2026-02-19T09:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.041202 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.041344 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.041368 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.041392 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.041409 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:57Z","lastTransitionTime":"2026-02-19T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.144502 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.144552 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.144567 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.144589 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.144603 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:57Z","lastTransitionTime":"2026-02-19T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.246986 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.247045 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.247062 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.247084 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.247131 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:57Z","lastTransitionTime":"2026-02-19T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.355632 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.355704 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.355723 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.355746 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.355766 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:57Z","lastTransitionTime":"2026-02-19T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.458998 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.459087 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.459134 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.459163 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.459182 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:57Z","lastTransitionTime":"2026-02-19T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.483633 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.483665 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:57 crc kubenswrapper[4873]: E0219 09:45:57.483828 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.483890 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:57 crc kubenswrapper[4873]: E0219 09:45:57.484082 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:57 crc kubenswrapper[4873]: E0219 09:45:57.484354 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.494911 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 06:40:49.426082993 +0000 UTC Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.562829 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.562903 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.562924 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.562947 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.563003 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:57Z","lastTransitionTime":"2026-02-19T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.665926 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.665964 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.665972 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.665986 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.666010 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:57Z","lastTransitionTime":"2026-02-19T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.768186 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.768220 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.768231 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.768268 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.768281 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:57Z","lastTransitionTime":"2026-02-19T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.870846 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.870907 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.870925 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.870950 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.870970 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:57Z","lastTransitionTime":"2026-02-19T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.973125 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.973175 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.973192 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.973208 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:57 crc kubenswrapper[4873]: I0219 09:45:57.973218 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:57Z","lastTransitionTime":"2026-02-19T09:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.075441 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.075487 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.075498 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.075517 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.075529 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:58Z","lastTransitionTime":"2026-02-19T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.177885 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.177925 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.177934 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.177948 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.177959 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:58Z","lastTransitionTime":"2026-02-19T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.280747 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.280786 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.280795 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.280809 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.280819 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:58Z","lastTransitionTime":"2026-02-19T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.382988 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.383036 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.383047 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.383065 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.383097 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:58Z","lastTransitionTime":"2026-02-19T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.484070 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:45:58 crc kubenswrapper[4873]: E0219 09:45:58.484232 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.485473 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.485509 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.485521 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.485536 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.485549 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:58Z","lastTransitionTime":"2026-02-19T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.495800 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 07:53:20.913026942 +0000 UTC Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.539369 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.539406 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.539417 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.539430 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.539440 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:58Z","lastTransitionTime":"2026-02-19T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:58 crc kubenswrapper[4873]: E0219 09:45:58.551883 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.556433 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.556476 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.556488 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.556505 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.556517 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:58Z","lastTransitionTime":"2026-02-19T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:58 crc kubenswrapper[4873]: E0219 09:45:58.571709 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.575737 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.575852 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.575927 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.575957 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.575975 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:58Z","lastTransitionTime":"2026-02-19T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.593299 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.593404 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.593431 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.593453 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.593470 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:58Z","lastTransitionTime":"2026-02-19T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:58 crc kubenswrapper[4873]: E0219 09:45:58.612454 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.617662 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.617718 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.617739 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.617769 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.617795 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:58Z","lastTransitionTime":"2026-02-19T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:58 crc kubenswrapper[4873]: E0219 09:45:58.636558 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 09:45:58 crc kubenswrapper[4873]: E0219 09:45:58.636765 4873 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.638987 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.639069 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.639132 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.639155 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.639172 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:58Z","lastTransitionTime":"2026-02-19T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.741969 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.742015 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.742036 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.742060 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.742077 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:58Z","lastTransitionTime":"2026-02-19T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.844383 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.844408 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.844417 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.844429 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.844437 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:58Z","lastTransitionTime":"2026-02-19T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.946174 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.946205 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.946217 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.946232 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:58 crc kubenswrapper[4873]: I0219 09:45:58.946243 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:58Z","lastTransitionTime":"2026-02-19T09:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.048049 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.048086 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.048132 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.048148 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.048156 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:59Z","lastTransitionTime":"2026-02-19T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.150255 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.150296 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.150308 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.150322 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.150333 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:59Z","lastTransitionTime":"2026-02-19T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.251937 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.251971 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.251980 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.251994 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.252008 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:59Z","lastTransitionTime":"2026-02-19T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.354881 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.354984 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.355003 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.355060 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.355078 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:59Z","lastTransitionTime":"2026-02-19T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.459008 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.459045 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.459055 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.459068 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.459078 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:59Z","lastTransitionTime":"2026-02-19T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.483644 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.483792 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:45:59 crc kubenswrapper[4873]: E0219 09:45:59.484048 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.484075 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:45:59 crc kubenswrapper[4873]: E0219 09:45:59.484178 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:45:59 crc kubenswrapper[4873]: E0219 09:45:59.484364 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.495978 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 03:08:09.048603072 +0000 UTC Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.561870 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.561896 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.561905 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.561917 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.561924 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:59Z","lastTransitionTime":"2026-02-19T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.664346 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.664383 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.664394 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.664410 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.664422 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:59Z","lastTransitionTime":"2026-02-19T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.767678 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.767733 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.767744 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.767757 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.767789 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:59Z","lastTransitionTime":"2026-02-19T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.869958 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.870009 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.870022 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.870037 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.870046 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:59Z","lastTransitionTime":"2026-02-19T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.975623 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.975670 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.975680 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.975695 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:45:59 crc kubenswrapper[4873]: I0219 09:45:59.975710 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:45:59Z","lastTransitionTime":"2026-02-19T09:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.077537 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.077589 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.077602 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.077621 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.077634 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:00Z","lastTransitionTime":"2026-02-19T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.179084 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.179150 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.179160 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.179173 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.179182 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:00Z","lastTransitionTime":"2026-02-19T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.281467 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.281505 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.281514 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.281525 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.281534 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:00Z","lastTransitionTime":"2026-02-19T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.383456 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.383489 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.383500 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.383516 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.383527 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:00Z","lastTransitionTime":"2026-02-19T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.483454 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:00 crc kubenswrapper[4873]: E0219 09:46:00.483595 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.485439 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.485501 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.485519 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.485542 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.485559 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:00Z","lastTransitionTime":"2026-02-19T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.496914 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 03:29:23.885835248 +0000 UTC Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.588849 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.588900 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.588914 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.588929 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.588942 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:00Z","lastTransitionTime":"2026-02-19T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.691863 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.691910 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.691922 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.691939 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.691950 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:00Z","lastTransitionTime":"2026-02-19T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.793892 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.793956 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.793966 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.793981 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.793991 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:00Z","lastTransitionTime":"2026-02-19T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.896267 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.896313 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.896327 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.896374 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.896386 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:00Z","lastTransitionTime":"2026-02-19T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.999250 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.999282 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.999291 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.999311 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:00 crc kubenswrapper[4873]: I0219 09:46:00.999319 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:00Z","lastTransitionTime":"2026-02-19T09:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.101953 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.102068 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.102086 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.102142 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.102180 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:01Z","lastTransitionTime":"2026-02-19T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.205320 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.205360 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.205371 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.205388 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.205400 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:01Z","lastTransitionTime":"2026-02-19T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.307968 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.308008 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.308021 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.308038 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.308049 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:01Z","lastTransitionTime":"2026-02-19T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.410887 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.410945 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.410965 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.410987 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.411004 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:01Z","lastTransitionTime":"2026-02-19T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.483469 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:01 crc kubenswrapper[4873]: E0219 09:46:01.483734 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.484541 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.484628 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:01 crc kubenswrapper[4873]: E0219 09:46:01.484685 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:01 crc kubenswrapper[4873]: E0219 09:46:01.484798 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.497093 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 16:06:32.447348871 +0000 UTC Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.497552 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\
"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.513000 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.513029 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.513038 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.513052 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.513061 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:01Z","lastTransitionTime":"2026-02-19T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.523392 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"message\\\":\\\"mespace event handler 1 for removal\\\\nI0219 09:45:46.434884 6553 handler.go:190] 
Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:46.434869 6553 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:46.434907 6553 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:46.434918 6553 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 09:45:46.434925 6553 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:46.434927 6553 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09:45:46.434932 6553 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 09:45:46.434970 6553 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 09:45:46.434920 6553 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:46.434985 6553 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 09:45:46.434982 6553 factory.go:656] Stopping watch factory\\\\nI0219 09:45:46.434994 6553 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 09:45:46.436159 6553 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:46.436204 6553 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 09:45:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c428
5fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.532704 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.544006 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T0
9:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.560321 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.577407 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.596188 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.607810 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.615359 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.615392 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.615406 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.615424 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.615437 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:01Z","lastTransitionTime":"2026-02-19T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.618870 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39a3f6b0-3581-4f36-85f6-24fa414b73f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15317bf5117d3cfd841bcc7a4475aeafb262813b4aa756a3b18eb4ccb598da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dbf05928c62c4c7f722f2d1fbd6a2
c7bfc3c1306c17dcc32780239860cc8220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4e9759da0e3aeedb86b3d7c95d090eb2326b9febb35ebd7dbe7f78abc2677c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.630921 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.641656 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc 
kubenswrapper[4873]: I0219 09:46:01.657267 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.671122 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.688764 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa01
3a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:
45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.706515 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d457f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.718366 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.718416 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.718433 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.718456 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.718474 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:01Z","lastTransitionTime":"2026-02-19T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.721843 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0
e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.735670 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:01Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.820870 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 
09:46:01.821198 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.821330 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.821452 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.821595 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:01Z","lastTransitionTime":"2026-02-19T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.923862 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.923904 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.923921 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.923942 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:01 crc kubenswrapper[4873]: I0219 09:46:01.923959 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:01Z","lastTransitionTime":"2026-02-19T09:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.027598 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.027656 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.027680 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.027709 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.027728 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:02Z","lastTransitionTime":"2026-02-19T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.130663 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.130723 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.130737 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.130754 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.130767 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:02Z","lastTransitionTime":"2026-02-19T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.233390 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.233639 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.233702 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.233768 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.233823 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:02Z","lastTransitionTime":"2026-02-19T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.336741 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.336947 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.337038 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.337122 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.337193 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:02Z","lastTransitionTime":"2026-02-19T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.439602 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.439643 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.439654 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.439669 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.439680 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:02Z","lastTransitionTime":"2026-02-19T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.483261 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:02 crc kubenswrapper[4873]: E0219 09:46:02.483638 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.497732 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 01:36:27.556930897 +0000 UTC Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.542816 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.542873 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.542891 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.542919 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.542938 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:02Z","lastTransitionTime":"2026-02-19T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.645896 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.645939 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.645950 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.645966 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.645976 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:02Z","lastTransitionTime":"2026-02-19T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.748365 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.748403 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.748413 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.748428 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.748437 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:02Z","lastTransitionTime":"2026-02-19T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.850534 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.850590 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.850607 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.850631 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.850649 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:02Z","lastTransitionTime":"2026-02-19T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.952721 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.952789 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.952812 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.952835 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:02 crc kubenswrapper[4873]: I0219 09:46:02.952852 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:02Z","lastTransitionTime":"2026-02-19T09:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.055095 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.055154 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.055163 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.055179 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.055189 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:03Z","lastTransitionTime":"2026-02-19T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.158474 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.158527 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.158541 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.158559 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.158572 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:03Z","lastTransitionTime":"2026-02-19T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.260561 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.260617 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.260635 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.260657 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.260674 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:03Z","lastTransitionTime":"2026-02-19T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.362861 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.362922 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.362945 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.362973 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.362994 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:03Z","lastTransitionTime":"2026-02-19T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.465278 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.465350 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.465368 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.465394 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.465410 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:03Z","lastTransitionTime":"2026-02-19T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.483967 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.483970 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.483993 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 09:46:03 crc kubenswrapper[4873]: E0219 09:46:03.484222 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 09:46:03 crc kubenswrapper[4873]: E0219 09:46:03.484353 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3"
Feb 19 09:46:03 crc kubenswrapper[4873]: E0219 09:46:03.484427 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.485063 4873 scope.go:117] "RemoveContainer" containerID="5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91"
Feb 19 09:46:03 crc kubenswrapper[4873]: E0219 09:46:03.485310 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.499258 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 10:55:57.973429486 +0000 UTC
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.568011 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.568068 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.568084 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.568134 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.568152 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:03Z","lastTransitionTime":"2026-02-19T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.671757 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.672062 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.672249 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.672399 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.672541 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:03Z","lastTransitionTime":"2026-02-19T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.774810 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.774842 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.774851 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.774862 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.774872 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:03Z","lastTransitionTime":"2026-02-19T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.781885 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs\") pod \"network-metrics-daemon-lcp8k\" (UID: \"98d35597-056d-48f0-b599-28b098dd45f3\") " pod="openshift-multus/network-metrics-daemon-lcp8k"
Feb 19 09:46:03 crc kubenswrapper[4873]: E0219 09:46:03.782220 4873 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 19 09:46:03 crc kubenswrapper[4873]: E0219 09:46:03.782385 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs podName:98d35597-056d-48f0-b599-28b098dd45f3 nodeName:}" failed. No retries permitted until 2026-02-19 09:46:35.782347014 +0000 UTC m=+105.071778862 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs") pod "network-metrics-daemon-lcp8k" (UID: "98d35597-056d-48f0-b599-28b098dd45f3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.877401 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.877470 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.877489 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.877515 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.877533 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:03Z","lastTransitionTime":"2026-02-19T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.980695 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.980765 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.980778 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.980795 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:03 crc kubenswrapper[4873]: I0219 09:46:03.980806 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:03Z","lastTransitionTime":"2026-02-19T09:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.084802 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.084884 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.084905 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.084931 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.084950 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:04Z","lastTransitionTime":"2026-02-19T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.188612 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.188672 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.188689 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.188713 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.188735 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:04Z","lastTransitionTime":"2026-02-19T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.291189 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.291237 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.291250 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.291267 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.291279 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:04Z","lastTransitionTime":"2026-02-19T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.393251 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.393298 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.393311 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.393347 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.393360 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:04Z","lastTransitionTime":"2026-02-19T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.483481 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 09:46:04 crc kubenswrapper[4873]: E0219 09:46:04.483754 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.495549 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.495609 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.495627 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.495651 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.495668 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:04Z","lastTransitionTime":"2026-02-19T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.499853 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 22:28:14.126187582 +0000 UTC
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.598509 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.598827 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.598922 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.599014 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.599114 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:04Z","lastTransitionTime":"2026-02-19T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.702267 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.702335 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.702359 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.702389 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.702414 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:04Z","lastTransitionTime":"2026-02-19T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.805462 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.805725 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.805795 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.805888 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.805978 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:04Z","lastTransitionTime":"2026-02-19T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.908782 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.908837 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.908850 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.908869 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.908883 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:04Z","lastTransitionTime":"2026-02-19T09:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.927528 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4pk8x_e1ae3d8d-27cf-489f-a6ba-ef914db74bff/kube-multus/0.log"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.927765 4873 generic.go:334] "Generic (PLEG): container finished" podID="e1ae3d8d-27cf-489f-a6ba-ef914db74bff" containerID="6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003" exitCode=1
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.927821 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4pk8x" event={"ID":"e1ae3d8d-27cf-489f-a6ba-ef914db74bff","Type":"ContainerDied","Data":"6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003"}
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.928396 4873 scope.go:117] "RemoveContainer" containerID="6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.947254 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:04Z is after 2025-08-24T17:21:41Z"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.960162 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:46:04Z\\\",\\\"message\\\":\\\"2026-02-19T09:45:19+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_00b4ab66-db85-4e00-bc50-f10b22269f46\\\\n2026-02-19T09:45:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00b4ab66-db85-4e00-bc50-f10b22269f46 to /host/opt/cni/bin/\\\\n2026-02-19T09:45:19Z [verbose] multus-daemon started\\\\n2026-02-19T09:45:19Z [verbose] Readiness Indicator file check\\\\n2026-02-19T09:46:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:04Z is after 2025-08-24T17:21:41Z"
Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.979989 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592
389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:04Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:04 crc kubenswrapper[4873]: I0219 09:46:04.993546 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d457f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-19T09:46:04Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.007643 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.017282 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.017329 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.017342 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.017360 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.017372 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:05Z","lastTransitionTime":"2026-02-19T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.022811 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.038724 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc 
kubenswrapper[4873]: I0219 09:46:05.064241 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"message\\\":\\\"mespace event handler 1 for removal\\\\nI0219 09:45:46.434884 6553 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:46.434869 6553 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:46.434907 6553 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:46.434918 6553 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 09:45:46.434925 6553 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:46.434927 6553 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09:45:46.434932 6553 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 09:45:46.434970 6553 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 09:45:46.434920 6553 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:46.434985 6553 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 09:45:46.434982 6553 factory.go:656] Stopping watch factory\\\\nI0219 09:45:46.434994 6553 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 09:45:46.436159 6553 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:46.436204 6553 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 09:45:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c428
5fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.078180 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.097565 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T0
9:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.111307 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39a3f6b0-3581-4f36-85f6-24fa414b73f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15317bf5117d3cfd841bcc7a4475aeafb262813b4aa756a3b18eb4ccb598da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dbf05928c62c4c7f722f2d1fbd6a2c7bfc3c1306c17dcc32780239860cc8220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4e9759da0e3aeedb86b3d7c95d090eb2326b9febb35ebd7dbe7f78abc2677c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.119932 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.119961 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.119969 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.119982 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.119992 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:05Z","lastTransitionTime":"2026-02-19T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.127258 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.137442 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.146553 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.158165 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.167907 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.176813 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc 
kubenswrapper[4873]: I0219 09:46:05.221865 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.221898 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.221909 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.221926 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.221937 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:05Z","lastTransitionTime":"2026-02-19T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.324327 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.324365 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.324377 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.324393 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.324405 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:05Z","lastTransitionTime":"2026-02-19T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.426884 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.426948 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.426967 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.426991 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.427008 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:05Z","lastTransitionTime":"2026-02-19T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.483540 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.483604 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:05 crc kubenswrapper[4873]: E0219 09:46:05.483713 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.483772 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:05 crc kubenswrapper[4873]: E0219 09:46:05.483940 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:05 crc kubenswrapper[4873]: E0219 09:46:05.484016 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.499999 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 08:53:13.366310048 +0000 UTC Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.529703 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.529766 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.529790 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.529822 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.529844 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:05Z","lastTransitionTime":"2026-02-19T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.632930 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.632981 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.632997 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.633019 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.633037 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:05Z","lastTransitionTime":"2026-02-19T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.735618 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.735678 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.735696 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.735719 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.735735 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:05Z","lastTransitionTime":"2026-02-19T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.838823 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.838908 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.838935 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.838968 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.838992 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:05Z","lastTransitionTime":"2026-02-19T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.934152 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4pk8x_e1ae3d8d-27cf-489f-a6ba-ef914db74bff/kube-multus/0.log" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.934215 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4pk8x" event={"ID":"e1ae3d8d-27cf-489f-a6ba-ef914db74bff","Type":"ContainerStarted","Data":"81ec7da29e3b03fb97f0d183d69bb256ed8f7340ca5df7e0c44bedd129b968cc"} Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.940438 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.940469 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.940494 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.940511 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.940520 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:05Z","lastTransitionTime":"2026-02-19T09:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.954008 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0
e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.969068 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:05 crc kubenswrapper[4873]: I0219 09:46:05.981802 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ec7da29e3b03fb97f0d183d69bb256ed8f7340ca5df7e0c44bedd129b968cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:46:04Z\\\",\\\"message\\\":\\\"2026-02-19T09:45:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_00b4ab66-db85-4e00-bc50-f10b22269f46\\\\n2026-02-19T09:45:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00b4ab66-db85-4e00-bc50-f10b22269f46 to /host/opt/cni/bin/\\\\n2026-02-19T09:45:19Z [verbose] multus-daemon started\\\\n2026-02-19T09:45:19Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T09:46:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:05Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.002837 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566
b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.015257 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d4
57f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.031128 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201
d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.043465 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.043531 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.043553 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.043582 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.043607 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:06Z","lastTransitionTime":"2026-02-19T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.047044 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.061188 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.087055 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"message\\\":\\\"mespace event handler 1 for removal\\\\nI0219 09:45:46.434884 6553 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:46.434869 6553 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:46.434907 6553 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:46.434918 6553 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 09:45:46.434925 6553 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:46.434927 6553 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09:45:46.434932 6553 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 09:45:46.434970 6553 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 09:45:46.434920 6553 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:46.434985 6553 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 09:45:46.434982 6553 factory.go:656] Stopping watch factory\\\\nI0219 09:45:46.434994 6553 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 09:45:46.436159 6553 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:46.436204 6553 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 09:45:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c428
5fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.100945 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.117717 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39a3f6b0-3581-4f36-85f6-24fa414b73f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15317bf5117d3cfd841bcc7a4475aeafb262813b4aa756a3b18eb4ccb598da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dbf05928c62c4c7f722f2d1fbd6a2c7bfc3c1306c17dcc32780239860cc8220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4e9759da0e3aeedb86b3d7c95d090eb2326b9febb35ebd7dbe7f78abc2677c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.135191 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.146068 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.146130 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.146147 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.146167 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.146185 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:06Z","lastTransitionTime":"2026-02-19T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.152983 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.169319 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.184880 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.199081 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.213280 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:06 crc 
kubenswrapper[4873]: I0219 09:46:06.248121 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.248164 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.248176 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.248195 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.248207 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:06Z","lastTransitionTime":"2026-02-19T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.350951 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.351001 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.351013 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.351029 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.351041 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:06Z","lastTransitionTime":"2026-02-19T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.453718 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.453770 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.453788 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.453813 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.453832 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:06Z","lastTransitionTime":"2026-02-19T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.483157 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:06 crc kubenswrapper[4873]: E0219 09:46:06.483333 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.500262 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 17:52:37.170349495 +0000 UTC Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.556609 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.556645 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.556656 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.556671 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.556681 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:06Z","lastTransitionTime":"2026-02-19T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.658994 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.659067 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.659086 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.659138 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.659156 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:06Z","lastTransitionTime":"2026-02-19T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.762388 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.762439 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.762459 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.762482 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.762498 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:06Z","lastTransitionTime":"2026-02-19T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.865229 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.865281 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.865291 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.865305 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.865316 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:06Z","lastTransitionTime":"2026-02-19T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.968039 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.968231 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.968257 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.968282 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:06 crc kubenswrapper[4873]: I0219 09:46:06.968301 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:06Z","lastTransitionTime":"2026-02-19T09:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.070710 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.070822 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.070849 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.070879 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.070898 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:07Z","lastTransitionTime":"2026-02-19T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.173960 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.174040 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.174066 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.174095 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.174142 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:07Z","lastTransitionTime":"2026-02-19T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.277500 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.277556 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.277573 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.277596 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.277613 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:07Z","lastTransitionTime":"2026-02-19T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.380821 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.380859 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.380870 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.380885 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.380895 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:07Z","lastTransitionTime":"2026-02-19T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.483718 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.483794 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:07 crc kubenswrapper[4873]: E0219 09:46:07.483914 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:07 crc kubenswrapper[4873]: E0219 09:46:07.484127 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.484223 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.484541 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.484592 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.484610 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.484651 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.484692 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:07Z","lastTransitionTime":"2026-02-19T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:07 crc kubenswrapper[4873]: E0219 09:46:07.484314 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.500719 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 14:11:26.089371297 +0000 UTC Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.587364 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.587441 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.587462 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.587491 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.587512 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:07Z","lastTransitionTime":"2026-02-19T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.689881 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.689944 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.689961 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.689989 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.690005 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:07Z","lastTransitionTime":"2026-02-19T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.792799 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.792946 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.792974 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.793003 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.793028 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:07Z","lastTransitionTime":"2026-02-19T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.896613 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.896675 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.896701 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.896732 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:07 crc kubenswrapper[4873]: I0219 09:46:07.896814 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:07Z","lastTransitionTime":"2026-02-19T09:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.000225 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.000323 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.000349 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.000380 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.000406 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.104266 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.104335 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.104357 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.104383 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.104402 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.208023 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.208090 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.208174 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.208206 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.208231 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.313037 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.313142 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.313163 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.313192 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.313214 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.416446 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.416500 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.416518 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.416543 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.416561 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.484014 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:08 crc kubenswrapper[4873]: E0219 09:46:08.484244 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.501303 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 09:23:02.926232579 +0000 UTC Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.501407 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.520030 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.520150 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.520192 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.520223 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.520249 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.623637 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.623704 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.623733 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.623762 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.623784 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.727317 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.727379 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.727396 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.727421 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.727440 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.830047 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.830155 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.830176 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.830197 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.830212 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.831402 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.831441 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.831455 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.831471 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.831483 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:08 crc kubenswrapper[4873]: E0219 09:46:08.845615 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.850183 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.850234 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.850247 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.850264 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.850277 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:08 crc kubenswrapper[4873]: E0219 09:46:08.865085 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:08Z is after 2025-08-24T17:21:41Z"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.869371 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.869410 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.869421 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.869437 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.869450 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 09:46:08 crc kubenswrapper[4873]: E0219 09:46:08.887912 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:08Z is after 2025-08-24T17:21:41Z"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.892136 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.892166 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.892177 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.892190 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.892201 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 09:46:08 crc kubenswrapper[4873]: E0219 09:46:08.908520 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:08Z is after 2025-08-24T17:21:41Z"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.912131 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.912158 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.912166 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.912177 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.912186 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:08 crc kubenswrapper[4873]: E0219 09:46:08.925055 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:08 crc kubenswrapper[4873]: E0219 09:46:08.925181 4873 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.933003 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.933051 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.933144 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.933173 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:08 crc kubenswrapper[4873]: I0219 09:46:08.933194 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:08Z","lastTransitionTime":"2026-02-19T09:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.037289 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.037358 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.037377 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.037402 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.037420 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:09Z","lastTransitionTime":"2026-02-19T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.140535 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.140615 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.140651 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.140680 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.140703 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:09Z","lastTransitionTime":"2026-02-19T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.244309 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.244394 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.244418 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.244447 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.244469 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:09Z","lastTransitionTime":"2026-02-19T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.348278 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.348380 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.348399 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.348420 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.348437 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:09Z","lastTransitionTime":"2026-02-19T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.451807 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.451856 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.451872 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.451894 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.451910 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:09Z","lastTransitionTime":"2026-02-19T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.483667 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:09 crc kubenswrapper[4873]: E0219 09:46:09.483807 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.483667 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.483861 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:09 crc kubenswrapper[4873]: E0219 09:46:09.483914 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:09 crc kubenswrapper[4873]: E0219 09:46:09.484055 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.501413 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 18:52:06.309609187 +0000 UTC Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.554506 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.554565 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.554584 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.554607 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.554625 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:09Z","lastTransitionTime":"2026-02-19T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.657834 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.658188 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.658361 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.658521 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.658668 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:09Z","lastTransitionTime":"2026-02-19T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.762521 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.762949 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.763205 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.763423 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.763603 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:09Z","lastTransitionTime":"2026-02-19T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.866896 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.866999 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.867018 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.867076 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.867099 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:09Z","lastTransitionTime":"2026-02-19T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.970481 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.970545 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.970569 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.970598 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:09 crc kubenswrapper[4873]: I0219 09:46:09.970622 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:09Z","lastTransitionTime":"2026-02-19T09:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.074135 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.074208 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.074231 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.074262 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.074284 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:10Z","lastTransitionTime":"2026-02-19T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.177466 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.177549 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.177567 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.177625 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.177644 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:10Z","lastTransitionTime":"2026-02-19T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.280680 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.280781 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.280800 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.280825 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.280843 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:10Z","lastTransitionTime":"2026-02-19T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.384162 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.384292 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.384384 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.384524 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.384600 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:10Z","lastTransitionTime":"2026-02-19T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.484071 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:10 crc kubenswrapper[4873]: E0219 09:46:10.484318 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.487650 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.487696 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.487709 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.487731 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.487746 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:10Z","lastTransitionTime":"2026-02-19T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.501835 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 19:10:00.93981824 +0000 UTC Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.590785 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.590842 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.590872 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.590895 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.590916 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:10Z","lastTransitionTime":"2026-02-19T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.693601 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.693655 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.693671 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.693700 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.693717 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:10Z","lastTransitionTime":"2026-02-19T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.797147 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.797183 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.797194 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.797209 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.797220 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:10Z","lastTransitionTime":"2026-02-19T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.900531 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.900598 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.900616 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.900641 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:10 crc kubenswrapper[4873]: I0219 09:46:10.900657 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:10Z","lastTransitionTime":"2026-02-19T09:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.004221 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.004277 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.004307 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.004333 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.004351 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:11Z","lastTransitionTime":"2026-02-19T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.108045 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.108271 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.108304 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.108328 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.108346 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:11Z","lastTransitionTime":"2026-02-19T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.211013 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.211074 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.211093 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.211185 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.211205 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:11Z","lastTransitionTime":"2026-02-19T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.314093 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.314180 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.314200 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.314226 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.314244 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:11Z","lastTransitionTime":"2026-02-19T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.416753 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.416829 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.416846 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.416872 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.416889 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:11Z","lastTransitionTime":"2026-02-19T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.483848 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.483947 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.483868 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:11 crc kubenswrapper[4873]: E0219 09:46:11.484045 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:11 crc kubenswrapper[4873]: E0219 09:46:11.484228 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:11 crc kubenswrapper[4873]: E0219 09:46:11.484374 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.502303 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 20:59:35.786869788 +0000 UTC Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.505180 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da5575d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.520886 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.520934 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.520955 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.520986 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.521010 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:11Z","lastTransitionTime":"2026-02-19T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.523869 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc 
kubenswrapper[4873]: I0219 09:46:11.543610 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.564021 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ec7da29e3b03fb97f0d183d69bb256ed8f7340ca5df7e0c44bedd129b968cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:46:04Z\\\",\\\"message\\\":\\\"2026-02-19T09:45:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_00b4ab66-db85-4e00-bc50-f10b22269f46\\\\n2026-02-19T09:45:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00b4ab66-db85-4e00-bc50-f10b22269f46 to /host/opt/cni/bin/\\\\n2026-02-19T09:45:19Z [verbose] multus-daemon started\\\\n2026-02-19T09:45:19Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T09:46:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.588339 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566
b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.607447 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d4
57f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.623796 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.623853 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.623870 4873 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.623894 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.623911 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:11Z","lastTransitionTime":"2026-02-19T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.628675 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.649540 4873 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.669771 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.701783 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"message\\\":\\\"mespace event handler 1 for removal\\\\nI0219 09:45:46.434884 6553 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:46.434869 6553 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:46.434907 6553 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:46.434918 6553 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 09:45:46.434925 6553 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:46.434927 6553 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09:45:46.434932 6553 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 09:45:46.434970 6553 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 09:45:46.434920 6553 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:46.434985 6553 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 09:45:46.434982 6553 factory.go:656] Stopping watch factory\\\\nI0219 09:45:46.434994 6553 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 09:45:46.436159 6553 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:46.436204 6553 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 09:45:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c428
5fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.719427 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.727191 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.727245 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.727263 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.727287 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.727304 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:11Z","lastTransitionTime":"2026-02-19T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.743850 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.763781 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.784952 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.804017 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.820195 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.830398 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.830488 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.830506 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.830530 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.830547 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:11Z","lastTransitionTime":"2026-02-19T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.836524 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee7c8560-6ad7-4360-9a1d-3137133ef615\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610f40cd2f97208fe9d6c9f3accc665ff4d1276e227f799b64e86b02be681a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83285abf13435688b3341f479e81eb617eacf5bb70ac2ea2c9218d9d9d313b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83285abf13435688b3341f479e81eb617eacf5bb70ac2ea2c9218d9d9d313b9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.855735 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39a3f6b0-3581-4f36-85f6-24fa414b73f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15317bf5117d3cfd841bcc7a4475aeafb262813b4aa756a3b18eb4ccb598da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dbf05928c62c4c7f722f2d1fbd6a2c7bfc3c1306c17dcc32780239860cc8220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4e9759da0e3aeedb86b3d7c95d090eb2326b9febb35ebd7dbe7f78abc2677c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:11Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.933814 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.933903 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.933998 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.934075 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:11 crc kubenswrapper[4873]: I0219 09:46:11.934154 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:11Z","lastTransitionTime":"2026-02-19T09:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.036577 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.036904 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.036916 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.036930 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.036961 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:12Z","lastTransitionTime":"2026-02-19T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.139747 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.139881 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.139912 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.139939 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.139960 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:12Z","lastTransitionTime":"2026-02-19T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.244522 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.244591 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.244616 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.244645 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.244667 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:12Z","lastTransitionTime":"2026-02-19T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.348086 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.348198 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.348222 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.348251 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.348274 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:12Z","lastTransitionTime":"2026-02-19T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.451137 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.451198 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.451215 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.451237 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.451254 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:12Z","lastTransitionTime":"2026-02-19T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.483278 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:12 crc kubenswrapper[4873]: E0219 09:46:12.483446 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.503415 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 09:39:04.683738931 +0000 UTC Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.554916 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.554977 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.554995 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.555020 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.555040 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:12Z","lastTransitionTime":"2026-02-19T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.658092 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.658208 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.658229 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.658257 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.658274 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:12Z","lastTransitionTime":"2026-02-19T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.761294 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.761351 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.761369 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.761394 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.761413 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:12Z","lastTransitionTime":"2026-02-19T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.864060 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.864190 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.864212 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.864236 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.864254 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:12Z","lastTransitionTime":"2026-02-19T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.966663 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.966729 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.966749 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.966773 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:12 crc kubenswrapper[4873]: I0219 09:46:12.966795 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:12Z","lastTransitionTime":"2026-02-19T09:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.072267 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.072331 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.072350 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.072375 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.072392 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:13Z","lastTransitionTime":"2026-02-19T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.175516 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.175607 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.175627 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.175651 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.175668 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:13Z","lastTransitionTime":"2026-02-19T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.278238 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.278272 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.278281 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.278295 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.278304 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:13Z","lastTransitionTime":"2026-02-19T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.381786 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.381859 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.381877 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.381901 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.381918 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:13Z","lastTransitionTime":"2026-02-19T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.483325 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.483330 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:13 crc kubenswrapper[4873]: E0219 09:46:13.483508 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.483354 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:13 crc kubenswrapper[4873]: E0219 09:46:13.483639 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:13 crc kubenswrapper[4873]: E0219 09:46:13.483870 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.484812 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.484867 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.484884 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.484908 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.484927 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:13Z","lastTransitionTime":"2026-02-19T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.504024 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 23:08:52.231053931 +0000 UTC Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.588701 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.588767 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.588784 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.588807 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.588824 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:13Z","lastTransitionTime":"2026-02-19T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.693585 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.693643 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.693666 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.693693 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.693714 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:13Z","lastTransitionTime":"2026-02-19T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.796631 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.796716 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.796740 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.796772 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.796795 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:13Z","lastTransitionTime":"2026-02-19T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.900594 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.900673 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.900710 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.900740 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:13 crc kubenswrapper[4873]: I0219 09:46:13.900761 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:13Z","lastTransitionTime":"2026-02-19T09:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.003296 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.003338 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.003351 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.003366 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.003379 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:14Z","lastTransitionTime":"2026-02-19T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.105554 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.105605 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.105615 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.105628 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.105660 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:14Z","lastTransitionTime":"2026-02-19T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.208820 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.208868 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.208884 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.208906 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.208925 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:14Z","lastTransitionTime":"2026-02-19T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.312498 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.312567 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.312590 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.312621 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.312643 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:14Z","lastTransitionTime":"2026-02-19T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.415432 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.415552 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.415581 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.415609 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.415632 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:14Z","lastTransitionTime":"2026-02-19T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.483387 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:14 crc kubenswrapper[4873]: E0219 09:46:14.483869 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.504704 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 04:02:42.24266388 +0000 UTC Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.509870 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.523742 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.523999 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.524027 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.524060 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.524083 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:14Z","lastTransitionTime":"2026-02-19T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.626716 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.626768 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.626788 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.626981 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.626998 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:14Z","lastTransitionTime":"2026-02-19T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.730417 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.730471 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.730489 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.730514 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.730535 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:14Z","lastTransitionTime":"2026-02-19T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.833831 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.833899 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.833926 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.833956 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.833976 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:14Z","lastTransitionTime":"2026-02-19T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.937160 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.937250 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.937273 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.937371 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:14 crc kubenswrapper[4873]: I0219 09:46:14.937394 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:14Z","lastTransitionTime":"2026-02-19T09:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.040630 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.040681 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.040697 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.040719 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.040735 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:15Z","lastTransitionTime":"2026-02-19T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.144037 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.144163 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.144190 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.144221 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.144240 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:15Z","lastTransitionTime":"2026-02-19T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.246658 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.246717 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.246735 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.246757 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.246773 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:15Z","lastTransitionTime":"2026-02-19T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.316527 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.316685 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.316711 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:19.316674779 +0000 UTC m=+148.606106427 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.316757 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.316809 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.316850 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.316856 4873 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 09:46:15 crc 
kubenswrapper[4873]: E0219 09:46:15.316891 4873 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.316941 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:47:19.316930565 +0000 UTC m=+148.606362213 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.316960 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 09:47:19.316951525 +0000 UTC m=+148.606383173 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.316982 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.317030 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.317063 4873 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.317078 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.317199 4873 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.317236 4873 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.317200 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 09:47:19.31716659 +0000 UTC m=+148.606598268 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.317343 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 09:47:19.317323024 +0000 UTC m=+148.606754702 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.350304 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.350404 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.350428 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.350453 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.350471 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:15Z","lastTransitionTime":"2026-02-19T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.453920 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.453974 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.453993 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.454016 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.454035 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:15Z","lastTransitionTime":"2026-02-19T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.484402 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.484473 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.484473 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.484604 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.484805 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:15 crc kubenswrapper[4873]: E0219 09:46:15.484914 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.505635 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 13:57:15.324700034 +0000 UTC Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.556900 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.556967 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.556984 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.557009 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.557026 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:15Z","lastTransitionTime":"2026-02-19T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.659592 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.659631 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.659643 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.659658 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.659670 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:15Z","lastTransitionTime":"2026-02-19T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.762150 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.762203 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.762221 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.762245 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.762261 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:15Z","lastTransitionTime":"2026-02-19T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.865482 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.865535 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.865552 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.865576 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.865593 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:15Z","lastTransitionTime":"2026-02-19T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.967736 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.967967 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.967981 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.967998 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:15 crc kubenswrapper[4873]: I0219 09:46:15.968012 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:15Z","lastTransitionTime":"2026-02-19T09:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.070801 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.070863 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.070880 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.070904 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.070921 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:16Z","lastTransitionTime":"2026-02-19T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.174408 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.174442 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.174454 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.174468 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.174477 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:16Z","lastTransitionTime":"2026-02-19T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.277685 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.277741 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.277757 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.277784 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.277804 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:16Z","lastTransitionTime":"2026-02-19T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.381533 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.381591 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.381608 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.381638 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.381655 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:16Z","lastTransitionTime":"2026-02-19T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.483087 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:16 crc kubenswrapper[4873]: E0219 09:46:16.483309 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.485270 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.485323 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.485341 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.485366 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.485385 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:16Z","lastTransitionTime":"2026-02-19T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.506080 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 02:32:13.175358078 +0000 UTC Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.589382 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.589439 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.589461 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.589490 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.589510 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:16Z","lastTransitionTime":"2026-02-19T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.692946 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.693015 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.693038 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.693067 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.693089 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:16Z","lastTransitionTime":"2026-02-19T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.795901 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.795956 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.795977 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.796006 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.796029 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:16Z","lastTransitionTime":"2026-02-19T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.899073 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.899176 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.899205 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.899233 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:16 crc kubenswrapper[4873]: I0219 09:46:16.899253 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:16Z","lastTransitionTime":"2026-02-19T09:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.002084 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.002186 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.002215 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.002251 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.002287 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:17Z","lastTransitionTime":"2026-02-19T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.105600 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.105670 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.105698 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.105734 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.105756 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:17Z","lastTransitionTime":"2026-02-19T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.209803 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.209900 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.209921 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.209977 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.209997 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:17Z","lastTransitionTime":"2026-02-19T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.312959 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.313370 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.313531 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.313671 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.313801 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:17Z","lastTransitionTime":"2026-02-19T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.417414 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.417918 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.418068 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.418280 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.418448 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:17Z","lastTransitionTime":"2026-02-19T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.483760 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.483778 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.483760 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:17 crc kubenswrapper[4873]: E0219 09:46:17.484195 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:17 crc kubenswrapper[4873]: E0219 09:46:17.485183 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:17 crc kubenswrapper[4873]: E0219 09:46:17.485401 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.485965 4873 scope.go:117] "RemoveContainer" containerID="5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.506900 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 13:59:50.974744898 +0000 UTC Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.521497 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.521554 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.521579 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.521610 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.521631 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:17Z","lastTransitionTime":"2026-02-19T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.626050 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.626183 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.626204 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.626260 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.626280 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:17Z","lastTransitionTime":"2026-02-19T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.728542 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.728597 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.728615 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.728638 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.728655 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:17Z","lastTransitionTime":"2026-02-19T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.831353 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.831415 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.831433 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.831459 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.831477 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:17Z","lastTransitionTime":"2026-02-19T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.933767 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.933798 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.933810 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.933827 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.933838 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:17Z","lastTransitionTime":"2026-02-19T09:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.978923 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/2.log" Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.981732 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerStarted","Data":"e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308"} Feb 19 09:46:17 crc kubenswrapper[4873]: I0219 09:46:17.982844 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.002774 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.034142 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"message\\\":\\\"mespace event handler 1 for removal\\\\nI0219 09:45:46.434884 6553 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:46.434869 6553 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:46.434907 6553 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:46.434918 6553 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 09:45:46.434925 6553 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:46.434927 6553 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09:45:46.434932 6553 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 09:45:46.434970 6553 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 09:45:46.434920 6553 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:46.434985 6553 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 09:45:46.434982 6553 factory.go:656] Stopping watch factory\\\\nI0219 09:45:46.434994 6553 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 09:45:46.436159 6553 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:46.436204 6553 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 
09:45:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.040337 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.040378 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.040391 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.040409 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.040422 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:18Z","lastTransitionTime":"2026-02-19T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.050470 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.071263 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201
d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.086950 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.108470 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.127755 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.142814 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.142882 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.142898 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.142918 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.142933 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:18Z","lastTransitionTime":"2026-02-19T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.145085 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.161559 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee7c8560-6ad7-4360-9a1d-3137133ef615\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610f40cd2f97208fe9d6c9f3accc665ff4d1276e227f799b64e86b02be681a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83285abf13435688b3341f479e81eb617eacf5bb70ac2ea2c9218d9d9d313b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83285abf13435688b3341f479e81eb617eacf5bb70ac2ea2c9218d9d9d313b9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.182388 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39a3f6b0-3581-4f36-85f6-24fa414b73f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15317bf5117d3cfd841bcc7a4475aeafb262813b4aa756a3b18eb4ccb598da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dbf05928c62c4c7f722f2d1fbd6a2c7bfc3c1306c17dcc32780239860cc8220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4e9759da0e3aeedb86b3d7c95d090eb2326b9febb35ebd7dbe7f78abc2677c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.198648 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.215733 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc 
kubenswrapper[4873]: I0219 09:46:18.242149 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8712bdb-10cc-4cf7-aa21-7f34761ec6cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b37236dcd27e2d6048ee4976e2f411df9eaf77c3bff9f07f06243e70b01cb63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://72381fa08fdd672973e1c6e175291b3d89a8f375117c23c7efc939fead3f62ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61ec168f9dd0b793cb746abec5fd0922cbabf69b71ae55d81c0523be8fac9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95584c308545e3ea96b9d6bd6712f4e8d355baeae70edf62e3499a0e984a13af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e2155b0ec1873858607c3d52b287a5533d1bbf707b7a39b4ca93e73177b2b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfd368c8a34e43275736777dadfbca8025018085d520e2d7ec2bac159e23eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfd368c8a34e43275736777dadfbca8025018085d520e2d7ec2bac159e23eb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc75e6f1c2480daa5a86dd017586fdda8ff44989f50d245bb77861ca6f575d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bc75e6f1c2480daa5a86dd017586fdda8ff44989f50d245bb77861ca6f575d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8e936da2913c3886d6abaf2e5e1a4b2e9cd3bc3f2d9c613258ac1437ce749673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e936da2913c3886d6abaf2e5e1a4b2e9cd3bc3f2d9c613258ac1437ce749673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.245978 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.246028 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.246045 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.246082 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.246097 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:18Z","lastTransitionTime":"2026-02-19T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.259187 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.279417 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ec7da29e3b03fb97f0d183d69bb256ed8f7340ca5df7e0c44bedd129b968cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:46:04Z\\\",\\\"message\\\":\\\"2026-02-19T09:45:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_00b4ab66-db85-4e00-bc50-f10b22269f46\\\\n2026-02-19T09:45:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00b4ab66-db85-4e00-bc50-f10b22269f46 to /host/opt/cni/bin/\\\\n2026-02-19T09:45:19Z [verbose] multus-daemon started\\\\n2026-02-19T09:45:19Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T09:46:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.298384 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566
b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.315478 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d4
57f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.336072 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.349735 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.349786 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.349798 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.349817 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.349828 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:18Z","lastTransitionTime":"2026-02-19T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.358626 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.453015 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.453078 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.453097 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.453151 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.453170 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:18Z","lastTransitionTime":"2026-02-19T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.483649 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:18 crc kubenswrapper[4873]: E0219 09:46:18.483828 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.507965 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 05:35:13.944160084 +0000 UTC Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.556632 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.556675 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.556691 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.556735 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:18 
crc kubenswrapper[4873]: I0219 09:46:18.556751 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:18Z","lastTransitionTime":"2026-02-19T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.659835 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.659897 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.659916 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.659941 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.659959 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:18Z","lastTransitionTime":"2026-02-19T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.763269 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.763331 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.763351 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.763379 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.763400 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:18Z","lastTransitionTime":"2026-02-19T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.866861 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.866921 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.866940 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.866963 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.866983 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:18Z","lastTransitionTime":"2026-02-19T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.970495 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.970566 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.970584 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.970610 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.970626 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:18Z","lastTransitionTime":"2026-02-19T09:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.988261 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/3.log" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.989420 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/2.log" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.996886 4873 generic.go:334] "Generic (PLEG): container finished" podID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerID="e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308" exitCode=1 Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.997017 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerDied","Data":"e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308"} Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.997154 4873 scope.go:117] "RemoveContainer" containerID="5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91" Feb 19 09:46:18 crc kubenswrapper[4873]: I0219 09:46:18.998284 4873 scope.go:117] "RemoveContainer" containerID="e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308" Feb 19 09:46:18 crc kubenswrapper[4873]: E0219 09:46:18.998648 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.039476 4873 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8712bdb-10cc-4cf7-aa21-7f34761ec6cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b37236dcd27e2d6048ee4976e2f411df9eaf77c3bff9f07f06243e70b01cb63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72381fa08fdd672973e1c6e175291b3d89a8f375117c23c7efc93
9fead3f62ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61ec168f9dd0b793cb746abec5fd0922cbabf69b71ae55d81c0523be8fac9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95584c308545e3ea96b9d6bd6712f4e8d355baeae70edf62e3499a0e984a13af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e2155b0ec1873858607c3d52b287a5533d1bbf707b7a39b4ca93e73177b2b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfd368c8a34e43275736777dadfbca8025018085d520e2d7ec2bac159e23eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfd368c8a34e43275736777dadfbc
a8025018085d520e2d7ec2bac159e23eb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc75e6f1c2480daa5a86dd017586fdda8ff44989f50d245bb77861ca6f575d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bc75e6f1c2480daa5a86dd017586fdda8ff44989f50d245bb77861ca6f575d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8e936da2913c3886d6abaf2e5e1a4b2e9cd3bc3f2d9c613258ac1437ce749673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e936da2913c3886d6abaf2e5e1a4b2e9cd3bc3f2d9c613258ac1437ce749673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.059872 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.072822 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.072882 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.072906 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 
09:46:19.072936 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.072958 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:19Z","lastTransitionTime":"2026-02-19T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.076638 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da5575d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.094366 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc 
kubenswrapper[4873]: I0219 09:46:19.112501 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.112575 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.112593 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.112618 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.112635 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:19Z","lastTransitionTime":"2026-02-19T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.115312 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0
e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: E0219 09:46:19.135796 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.140847 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.140912 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.140936 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.140966 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.140988 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:19Z","lastTransitionTime":"2026-02-19T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.141136 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: E0219 09:46:19.164046 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.166212 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ec7da29e3b03fb97f0d183d69bb256ed8f7340ca5df7e0c44bedd129b968cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b0
03\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:46:04Z\\\",\\\"message\\\":\\\"2026-02-19T09:45:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_00b4ab66-db85-4e00-bc50-f10b22269f46\\\\n2026-02-19T09:45:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00b4ab66-db85-4e00-bc50-f10b22269f46 to /host/opt/cni/bin/\\\\n2026-02-19T09:45:19Z [verbose] multus-daemon started\\\\n2026-02-19T09:45:19Z [verbose] Readiness Indicator file check\\\\n2026-02-19T09:46:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\
"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.172861 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.172920 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.172941 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.172964 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:19 crc 
kubenswrapper[4873]: I0219 09:46:19.172981 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:19Z","lastTransitionTime":"2026-02-19T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:19 crc kubenswrapper[4873]: E0219 09:46:19.190473 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.194268 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc19
4fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\
\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa
93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.195272 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.195311 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.195323 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.195340 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.195354 
4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:19Z","lastTransitionTime":"2026-02-19T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.210863 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d
06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d457f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: E0219 09:46:19.219623 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370
996d8eea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.224906 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.224961 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.224981 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.225004 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.225022 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:19Z","lastTransitionTime":"2026-02-19T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.228852 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: E0219 09:46:19.243895 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c923eae3-7568-4314-b0e0-48838f6e14fe\\\",\\\"systemUUID\\\":\\\"9f3a4afb-9582-465c-ace5-f370996d8eea\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: E0219 09:46:19.244172 4873 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.246272 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.246327 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.246344 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.246373 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.246393 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:19Z","lastTransitionTime":"2026-02-19T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.248030 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.271329 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.301835 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cadbc18d55dbcd16c2feb411be593e571abf164f466518982ee8271c4d3cc91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"message\\\":\\\"mespace event handler 1 for removal\\\\nI0219 09:45:46.434884 6553 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 09:45:46.434869 6553 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 09:45:46.434907 6553 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 09:45:46.434918 6553 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 09:45:46.434925 6553 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 09:45:46.434927 6553 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 09:45:46.434932 6553 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 09:45:46.434970 6553 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 09:45:46.434920 6553 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 09:45:46.434985 6553 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 09:45:46.434982 6553 factory.go:656] Stopping watch factory\\\\nI0219 09:45:46.434994 6553 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 09:45:46.436159 6553 ovnkube.go:599] Stopped ovnkube\\\\nI0219 09:45:46.436204 6553 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 
09:45:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:46:18Z\\\",\\\"message\\\":\\\"alName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.10],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0219 09:46:18.424282 6989 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nF0219 09:46:18.424294 6989 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.318824 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.333818 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee7c8560-6ad7-4360-9a1d-3137133ef615\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610f40cd2f97208fe9d6c9f3accc665ff4d1276e227f799b64e86b02be681a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83285abf13435688b3341f479e81eb617eacf5bb70ac2ea2c9218d9d9d313b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83285abf13435688b3341f479e81eb617eacf5bb70ac2ea2c9218d9d9d313b9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc 
kubenswrapper[4873]: I0219 09:46:19.350293 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.350350 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.350368 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.350392 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.350409 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:19Z","lastTransitionTime":"2026-02-19T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.352532 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39a3f6b0-3581-4f36-85f6-24fa414b73f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15317bf5117d3cfd841bcc7a4475aeafb262813b4aa756a3b18eb4ccb598da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dbf05928c62c4c7f722f2d1fbd6a2
c7bfc3c1306c17dcc32780239860cc8220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4e9759da0e3aeedb86b3d7c95d090eb2326b9febb35ebd7dbe7f78abc2677c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.371416 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.390295 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.408774 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.453648 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.453711 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.453731 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.453754 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.453772 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:19Z","lastTransitionTime":"2026-02-19T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.484094 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:19 crc kubenswrapper[4873]: E0219 09:46:19.484349 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.484721 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:19 crc kubenswrapper[4873]: E0219 09:46:19.484878 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.485243 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:19 crc kubenswrapper[4873]: E0219 09:46:19.485367 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.508519 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 03:26:12.613388434 +0000 UTC Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.557093 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.557213 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.557270 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.557309 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.557333 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:19Z","lastTransitionTime":"2026-02-19T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.660148 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.660218 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.660243 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.660267 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.660284 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:19Z","lastTransitionTime":"2026-02-19T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.762986 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.763058 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.763074 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.763098 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.763142 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:19Z","lastTransitionTime":"2026-02-19T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.866779 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.866838 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.866854 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.866877 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.866896 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:19Z","lastTransitionTime":"2026-02-19T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.969763 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.969835 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.969850 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.969872 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:19 crc kubenswrapper[4873]: I0219 09:46:19.969886 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:19Z","lastTransitionTime":"2026-02-19T09:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.003712 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/3.log" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.009735 4873 scope.go:117] "RemoveContainer" containerID="e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308" Feb 19 09:46:20 crc kubenswrapper[4873]: E0219 09:46:20.009991 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.032603 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d4
57f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.055562 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.074287 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.074362 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.074385 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.074416 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.074437 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:20Z","lastTransitionTime":"2026-02-19T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.076860 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.098217 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ec7da29e3b03fb97f0d183d69bb256ed8f7340ca5df7e0c44bedd129b968cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:46:04Z\\\",\\\"message\\\":\\\"2026-02-19T09:45:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_00b4ab66-db85-4e00-bc50-f10b22269f46\\\\n2026-02-19T09:45:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00b4ab66-db85-4e00-bc50-f10b22269f46 to /host/opt/cni/bin/\\\\n2026-02-19T09:45:19Z [verbose] multus-daemon started\\\\n2026-02-19T09:45:19Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T09:46:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.126550 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566
b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.146243 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.173947 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T0
9:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.177151 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.177218 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.177236 4873 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.177266 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.177285 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:20Z","lastTransitionTime":"2026-02-19T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.196075 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.219323 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc15eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.251454 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:46:18Z\\\",\\\"message\\\":\\\"alName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.10],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ing
ress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0219 09:46:18.424282 6989 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nF0219 09:46:18.424294 6989 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:46:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c428
5fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.271566 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.280272 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.280325 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.280344 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.280366 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.280383 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:20Z","lastTransitionTime":"2026-02-19T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.289183 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee7c8560-6ad7-4360-9a1d-3137133ef615\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610f40cd2f97208fe9d6c9f3accc665ff4d1276e227f799b64e86b02be681a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83285abf13435688b3341f479e81eb617eacf5bb70ac2ea2c9218d9d9d313b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83285abf13435688b3341f479e81eb617eacf5bb70ac2ea2c9218d9d9d313b9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.307893 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39a3f6b0-3581-4f36-85f6-24fa414b73f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15317bf5117d3cfd841bcc7a4475aeafb262813b4aa756a3b18eb4ccb598da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dbf05928c62c4c7f722f2d1fbd6a2c7bfc3c1306c17dcc32780239860cc8220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4e9759da0e3aeedb86b3d7c95d090eb2326b9febb35ebd7dbe7f78abc2677c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.328683 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.347576 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.379765 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8712bdb-10cc-4cf7-aa21-7f34761ec6cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b37236dcd27e2d6048ee4976e2f411df9eaf77c3bff9f07f06243e70b01cb63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72381fa08fdd672973e1c6e175291b3d89a8f375117c23c7efc939fead3f62ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61ec168f9dd0b793cb746abec5fd0922cbabf69b71ae55d81c0523be8fac9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95584c308545e3ea96b9d6bd6712f4e8d355baeae70edf62e3499a0e984a13af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e2155b0ec1873858607c3d52b287a5533d1bbf707b7a39b4ca93e73177b2b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfd368c8a34e43275736777dadfbca8025018085d520e2d7ec2bac159e23eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfd368c8a34e43275736777dadfbca8025018085d520e2d7ec2bac159e23eb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc75e6f1c2480daa5a86dd017586fdda8ff44989f50d245bb77861ca6f575d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bc75e6f1c2480daa5a86dd017586fdda8ff44989f50d245bb77861ca6f575d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8e936da2913c3886d6abaf2e5e1a4b2e9cd3bc3f2d9c613258ac1437ce749673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e936da2913c3886d6abaf2e5e1a4b2e9cd3bc3f2d9c613258ac1437ce749673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.383267 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.383333 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.383350 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.383374 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.383393 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:20Z","lastTransitionTime":"2026-02-19T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.402518 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.419577 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.435966 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:20 crc 
kubenswrapper[4873]: I0219 09:46:20.483841 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:20 crc kubenswrapper[4873]: E0219 09:46:20.484001 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.486040 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.486098 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.486143 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.486166 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.486183 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:20Z","lastTransitionTime":"2026-02-19T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.508863 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 08:38:51.165205996 +0000 UTC Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.588954 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.589007 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.589025 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.589045 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.589059 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:20Z","lastTransitionTime":"2026-02-19T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.691634 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.692093 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.692396 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.692537 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.692699 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:20Z","lastTransitionTime":"2026-02-19T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.796610 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.796955 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.797177 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.797397 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.797551 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:20Z","lastTransitionTime":"2026-02-19T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.900318 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.900377 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.900400 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.900429 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:20 crc kubenswrapper[4873]: I0219 09:46:20.900450 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:20Z","lastTransitionTime":"2026-02-19T09:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.003186 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.003233 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.003247 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.003266 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.003280 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:21Z","lastTransitionTime":"2026-02-19T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.105507 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.105545 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.105555 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.105569 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.105578 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:21Z","lastTransitionTime":"2026-02-19T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.208316 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.208372 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.208390 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.208414 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.208433 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:21Z","lastTransitionTime":"2026-02-19T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.311800 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.311847 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.311864 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.311887 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.311903 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:21Z","lastTransitionTime":"2026-02-19T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.414811 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.414838 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.414847 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.414858 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.414869 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:21Z","lastTransitionTime":"2026-02-19T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.483714 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.484352 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:21 crc kubenswrapper[4873]: E0219 09:46:21.484619 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.484458 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:21 crc kubenswrapper[4873]: E0219 09:46:21.484733 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:21 crc kubenswrapper[4873]: E0219 09:46:21.484891 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.509539 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 04:16:57.266972742 +0000 UTC Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.512674 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7760a15-9ea0-42f0-b42b-72de30071d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:46:18Z\\\",\\\"message\\\":\\\"alName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.10],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ing
ress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0219 09:46:18.424282 6989 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nF0219 09:46:18.424294 6989 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:46:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f649d2d74dd0c428
5fa3b98974ee330db985e62e9fff24b68e7a028045427b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j94bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.517682 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.517728 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.517743 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.517763 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.517779 4873 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:21Z","lastTransitionTime":"2026-02-19T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.525213 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kbv7k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fdff17-cdda-468e-8520-7f0937acd8db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93aee03e84febb69ba4a4a4d76aeb12d03ce9c7e0c00e318241f2286c3e548f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gql7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kbv7k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.539401 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6df925a-1654-4ade-a300-97c316b0867f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T09:45:11Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0219 09:45:11.226743 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0219 09:45:11.226899 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 09:45:11.227747 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3099442580/tls.crt::/tmp/serving-cert-3099442580/tls.key\\\\\\\"\\\\nI0219 09:45:11.466170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 09:45:11.471307 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 09:45:11.471337 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 09:45:11.471373 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 09:45:11.471384 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 09:45:11.491622 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 09:45:11.491660 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491669 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 09:45:11.491679 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 09:45:11.491686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 09:45:11.491692 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 09:45:11.491698 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 09:45:11.491774 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 09:45:11.493330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dad8f7655015996610223e9351f82201
d43903a0c017a8be9a56240bc160279\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.555519 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf63b7395a16d736e40432eaf3de6a5fdc32607519a49bd98dc44a1c190682ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.574276 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f6e9aa67864d205ea3b5abf40cfff62e5f3e5edc119a2871696e283a1e90162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d9df67bb7a7bc1
5eec03cf919cd4ed3f1d3d10f2ed97590a49f6882fbbe8675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.587502 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.609166 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pp77w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8bbad50-17a6-49b3-aa6a-3d8bcf05f5ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28725576a9ad3dbc36c016d908d30b313f11f977e0cc96d07d94561b473b707b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pp77w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.619904 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.619952 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.619965 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.619984 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.619998 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:21Z","lastTransitionTime":"2026-02-19T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.625595 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee7c8560-6ad7-4360-9a1d-3137133ef615\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610f40cd2f97208fe9d6c9f3accc665ff4d1276e227f799b64e86b02be681a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83285abf13435688b3341f479e81eb617eacf5bb70ac2ea2c9218d9d9d313b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83285abf13435688b3341f479e81eb617eacf5bb70ac2ea2c9218d9d9d313b9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.647958 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39a3f6b0-3581-4f36-85f6-24fa414b73f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15317bf5117d3cfd841bcc7a4475aeafb262813b4aa756a3b18eb4ccb598da3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dbf05928c62c4c7f722f2d1fbd6a2c7bfc3c1306c17dcc32780239860cc8220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f4e9759da0e3aeedb86b3d7c95d090eb2326b9febb35ebd7dbe7f78abc2677c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://034d4dad3ade3a0cad5aea021ed97933978f290a04b8b71d38fa565d46a29e9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.664952 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c605647e0eb6483375a17f468fb0a3c0d91778943b69f2b28b8557484c1db7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.675977 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98d35597-056d-48f0-b599-28b098dd45f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvs9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:31Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lcp8k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc 
kubenswrapper[4873]: I0219 09:46:21.693173 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8712bdb-10cc-4cf7-aa21-7f34761ec6cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b37236dcd27e2d6048ee4976e2f411df9eaf77c3bff9f07f06243e70b01cb63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://72381fa08fdd672973e1c6e175291b3d89a8f375117c23c7efc939fead3f62ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d61ec168f9dd0b793cb746abec5fd0922cbabf69b71ae55d81c0523be8fac9d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95584c308545e3ea96b9d6bd6712f4e8d355baeae70edf62e3499a0e984a13af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e2155b0ec1873858607c3d52b287a5533d1bbf707b7a39b4ca93e73177b2b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfd368c8a34e43275736777dadfbca8025018085d520e2d7ec2bac159e23eb7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfd368c8a34e43275736777dadfbca8025018085d520e2d7ec2bac159e23eb7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc75e6f1c2480daa5a86dd017586fdda8ff44989f50d245bb77861ca6f575d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bc75e6f1c2480daa5a86dd017586fdda8ff44989f50d245bb77861ca6f575d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8e936da2913c3886d6abaf2e5e1a4b2e9cd3bc3f2d9c613258ac1437ce749673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e936da2913c3886d6abaf2e5e1a4b2e9cd3bc3f2d9c613258ac1437ce749673\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:44:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.704921 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.717096 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c61760e-2955-4688-b68b-1ceeda73f356\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f93c9f80e1dd0497f0fe105c54dc34c55e39f55d781938dcd452eea9de7b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da557
5d7c4d647ae91446f1b76837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgmwg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qmsl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.722987 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.723031 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.723045 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:21 crc 
kubenswrapper[4873]: I0219 09:46:21.723072 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.723084 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:21Z","lastTransitionTime":"2026-02-19T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.742218 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acb9409d-e5b1-4d32-9200-8dc32d8923d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://566b71024f32af15bc24bee5fc194fca3a17f7b2f998034c0b606befbd75a91e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844e6eff3289a73700f5f016ff416dffa2d8fb519cc6daa013a89b77c86a8795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f83e0d0438b8f7f877ec029e26bbb7ac26a28a8b5e4d636f2dc117773afa505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab0e336ae80a0a11faa1f31c84c8d64ecdfb1edb4605288dab414b9c6764189\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8592389e4b69c42b9025726509f03f6864467be724f61cd8a949e468efb8a9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34000d67cc74d05b2123daa28bd03f66cb5e5649d0b22e3e15eb035956d146a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5be40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b
e40215c68ae53a4516ce126de7e3a6bf897ca543f0dc39bbadd4c3d0fb0d39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T09:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr6v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n2lwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.756799 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"566305a3-ea47-4e60-b247-5b32fa8544e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96a5934abcf2b7d61e3ecde971dba8329b4dfb5565d3e926106cf023650a86d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc5e33551a9c1389c85f629b887464d293d4
57f3206e462c8758a01fbaa6dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:45:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpvr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:30Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t7gjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.772472 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7cd1737-26f8-469d-a081-673330afcc5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:44:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0891943575e34f299ce1c864c89d88cf0653fc114ccddd336460482a1ed05b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13a18034acac9f1de3bcd5002e36a77980bc97f6545ec6ad50d2101b7dff95b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5a02c6fa1232d8b067f89cdfc3ceded8077483d001343a7c8a7aeacb732627f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T09:44:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:44:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.788947 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.807132 4873 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4pk8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1ae3d8d-27cf-489f-a6ba-ef914db74bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:45:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T09:46:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ec7da29e3b03fb97f0d183d69bb256ed8f7340ca5df7e0c44bedd129b968cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T09:46:04Z\\\",\\\"message\\\":\\\"2026-02-19T09:45:19+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_00b4ab66-db85-4e00-bc50-f10b22269f46\\\\n2026-02-19T09:45:19+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_00b4ab66-db85-4e00-bc50-f10b22269f46 to /host/opt/cni/bin/\\\\n2026-02-19T09:45:19Z [verbose] multus-daemon started\\\\n2026-02-19T09:45:19Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T09:46:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T09:45:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T09:46:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnjnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T09:45:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4pk8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T09:46:21Z is after 2025-08-24T17:21:41Z" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.826226 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.826291 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.826316 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.826347 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.826367 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:21Z","lastTransitionTime":"2026-02-19T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.929890 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.929953 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.929973 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.930000 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:21 crc kubenswrapper[4873]: I0219 09:46:21.930019 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:21Z","lastTransitionTime":"2026-02-19T09:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.033347 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.033407 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.033425 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.033450 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.033467 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:22Z","lastTransitionTime":"2026-02-19T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.137191 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.137256 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.137279 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.137307 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.137327 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:22Z","lastTransitionTime":"2026-02-19T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.240884 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.240967 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.240986 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.241011 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.241031 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:22Z","lastTransitionTime":"2026-02-19T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.343372 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.343420 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.343455 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.343473 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.343485 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:22Z","lastTransitionTime":"2026-02-19T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.446261 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.446337 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.446362 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.446393 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.446416 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:22Z","lastTransitionTime":"2026-02-19T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.484169 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:22 crc kubenswrapper[4873]: E0219 09:46:22.484353 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.509913 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 16:59:40.704236036 +0000 UTC Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.549865 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.550009 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.550033 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.550062 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.550086 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:22Z","lastTransitionTime":"2026-02-19T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.653305 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.653355 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.653369 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.653387 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.653399 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:22Z","lastTransitionTime":"2026-02-19T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.755881 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.755935 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.755952 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.755977 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.755994 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:22Z","lastTransitionTime":"2026-02-19T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.859252 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.859308 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.859329 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.859354 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.859372 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:22Z","lastTransitionTime":"2026-02-19T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.962306 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.962365 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.962383 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.962407 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:22 crc kubenswrapper[4873]: I0219 09:46:22.962429 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:22Z","lastTransitionTime":"2026-02-19T09:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.065022 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.065085 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.065130 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.065155 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.065179 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:23Z","lastTransitionTime":"2026-02-19T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.167750 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.167809 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.167835 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.167867 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.167890 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:23Z","lastTransitionTime":"2026-02-19T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.271577 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.271636 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.271653 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.271676 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.271695 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:23Z","lastTransitionTime":"2026-02-19T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.375085 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.375186 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.375212 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.375245 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.375266 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:23Z","lastTransitionTime":"2026-02-19T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.478156 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.478207 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.478220 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.478235 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.478249 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:23Z","lastTransitionTime":"2026-02-19T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.483740 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.483874 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:23 crc kubenswrapper[4873]: E0219 09:46:23.483914 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.483980 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:23 crc kubenswrapper[4873]: E0219 09:46:23.484091 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:23 crc kubenswrapper[4873]: E0219 09:46:23.484225 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.510858 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 04:14:47.536313976 +0000 UTC Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.581294 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.581369 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.581389 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.581415 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.581437 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:23Z","lastTransitionTime":"2026-02-19T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.685365 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.685455 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.685475 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.685501 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.685518 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:23Z","lastTransitionTime":"2026-02-19T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.788577 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.788634 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.788652 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.788675 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.788692 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:23Z","lastTransitionTime":"2026-02-19T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.891884 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.891948 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.891966 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.891991 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.892009 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:23Z","lastTransitionTime":"2026-02-19T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.995806 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.995864 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.995884 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.995908 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:23 crc kubenswrapper[4873]: I0219 09:46:23.995926 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:23Z","lastTransitionTime":"2026-02-19T09:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.098995 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.099061 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.099082 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.099130 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.099148 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:24Z","lastTransitionTime":"2026-02-19T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.202145 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.202225 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.202248 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.202275 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.202293 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:24Z","lastTransitionTime":"2026-02-19T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.305891 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.305997 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.306017 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.306041 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.306079 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:24Z","lastTransitionTime":"2026-02-19T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.408996 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.409139 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.409160 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.409184 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.409200 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:24Z","lastTransitionTime":"2026-02-19T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.483940 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:24 crc kubenswrapper[4873]: E0219 09:46:24.484162 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.511391 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 03:48:27.523658403 +0000 UTC Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.512420 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.512492 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.512516 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.512545 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.512571 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:24Z","lastTransitionTime":"2026-02-19T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.616381 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.616458 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.616479 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.616959 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.617015 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:24Z","lastTransitionTime":"2026-02-19T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.720515 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.720597 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.720621 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.720649 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.720670 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:24Z","lastTransitionTime":"2026-02-19T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.823595 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.823656 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.823674 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.823699 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.823722 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:24Z","lastTransitionTime":"2026-02-19T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.926670 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.926734 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.926750 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.926778 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:24 crc kubenswrapper[4873]: I0219 09:46:24.926801 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:24Z","lastTransitionTime":"2026-02-19T09:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.029161 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.029270 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.029290 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.029313 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.029330 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:25Z","lastTransitionTime":"2026-02-19T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.132507 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.132571 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.132589 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.132613 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.132629 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:25Z","lastTransitionTime":"2026-02-19T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.235709 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.235779 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.235797 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.235827 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.235849 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:25Z","lastTransitionTime":"2026-02-19T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.338505 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.338561 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.338577 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.338603 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.338621 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:25Z","lastTransitionTime":"2026-02-19T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.442216 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.442284 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.442301 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.442324 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.442341 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:25Z","lastTransitionTime":"2026-02-19T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.483756 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.483841 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:25 crc kubenswrapper[4873]: E0219 09:46:25.483914 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:25 crc kubenswrapper[4873]: E0219 09:46:25.484026 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.484149 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:25 crc kubenswrapper[4873]: E0219 09:46:25.484254 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.548471 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 10:41:48.728004914 +0000 UTC Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.550516 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.550564 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.550580 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.550601 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.550618 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:25Z","lastTransitionTime":"2026-02-19T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.653740 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.653807 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.653827 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.653854 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.653873 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:25Z","lastTransitionTime":"2026-02-19T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.756916 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.756984 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.757001 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.757028 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.757046 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:25Z","lastTransitionTime":"2026-02-19T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.860550 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.860627 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.860645 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.860671 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.860689 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:25Z","lastTransitionTime":"2026-02-19T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.963340 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.963419 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.963445 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.963476 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:25 crc kubenswrapper[4873]: I0219 09:46:25.963500 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:25Z","lastTransitionTime":"2026-02-19T09:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.066700 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.066775 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.066797 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.066826 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.066848 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:26Z","lastTransitionTime":"2026-02-19T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.170149 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.170205 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.170223 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.170249 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.170266 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:26Z","lastTransitionTime":"2026-02-19T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.274017 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.274657 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.274816 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.274951 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.275077 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:26Z","lastTransitionTime":"2026-02-19T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.378740 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.378853 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.378871 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.378897 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.378914 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:26Z","lastTransitionTime":"2026-02-19T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.482007 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.482077 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.482142 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.482180 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.482203 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:26Z","lastTransitionTime":"2026-02-19T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.483286 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:26 crc kubenswrapper[4873]: E0219 09:46:26.483647 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.549257 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 08:55:49.251107807 +0000 UTC Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.584320 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.584374 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.584392 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.584416 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.584435 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:26Z","lastTransitionTime":"2026-02-19T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.686678 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.686729 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.686740 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.686761 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.686775 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:26Z","lastTransitionTime":"2026-02-19T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.789750 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.789819 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.789839 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.789863 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.789880 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:26Z","lastTransitionTime":"2026-02-19T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.893845 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.893904 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.893921 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.893946 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.893964 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:26Z","lastTransitionTime":"2026-02-19T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.996924 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.996987 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.997008 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.997038 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:26 crc kubenswrapper[4873]: I0219 09:46:26.997061 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:26Z","lastTransitionTime":"2026-02-19T09:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.100436 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.100501 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.100521 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.100547 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.100569 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:27Z","lastTransitionTime":"2026-02-19T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.204242 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.204326 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.204351 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.204376 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.204395 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:27Z","lastTransitionTime":"2026-02-19T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.306928 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.307081 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.307136 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.307164 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.307182 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:27Z","lastTransitionTime":"2026-02-19T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.410723 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.410785 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.410805 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.410830 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.410848 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:27Z","lastTransitionTime":"2026-02-19T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.484082 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.484151 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.484194 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:27 crc kubenswrapper[4873]: E0219 09:46:27.484321 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:27 crc kubenswrapper[4873]: E0219 09:46:27.484435 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:27 crc kubenswrapper[4873]: E0219 09:46:27.484668 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.513385 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.513442 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.513462 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.513486 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.513504 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:27Z","lastTransitionTime":"2026-02-19T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.549746 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 09:22:53.423393531 +0000 UTC Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.616225 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.616279 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.616297 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.616321 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.616338 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:27Z","lastTransitionTime":"2026-02-19T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.719553 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.719630 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.719656 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.719686 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.719707 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:27Z","lastTransitionTime":"2026-02-19T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.823386 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.823488 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.823507 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.823537 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.823555 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:27Z","lastTransitionTime":"2026-02-19T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.926354 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.926422 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.926440 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.926465 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:27 crc kubenswrapper[4873]: I0219 09:46:27.926482 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:27Z","lastTransitionTime":"2026-02-19T09:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.029055 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.029167 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.029199 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.029230 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.029251 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:28Z","lastTransitionTime":"2026-02-19T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.132482 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.132565 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.132588 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.132614 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.132632 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:28Z","lastTransitionTime":"2026-02-19T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.236482 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.236551 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.236570 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.236596 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.236614 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:28Z","lastTransitionTime":"2026-02-19T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.339680 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.339738 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.339756 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.339786 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.339810 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:28Z","lastTransitionTime":"2026-02-19T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.442624 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.443036 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.443238 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.443545 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.443745 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:28Z","lastTransitionTime":"2026-02-19T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.483506 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:28 crc kubenswrapper[4873]: E0219 09:46:28.483850 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.547929 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.547990 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.548008 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.548033 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.548050 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:28Z","lastTransitionTime":"2026-02-19T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.550523 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 03:02:11.473186163 +0000 UTC Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.651983 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.652047 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.652065 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.652171 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.652192 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:28Z","lastTransitionTime":"2026-02-19T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.755524 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.755589 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.755612 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.755640 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.755663 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:28Z","lastTransitionTime":"2026-02-19T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.858278 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.858556 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.858685 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.858786 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.858877 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:28Z","lastTransitionTime":"2026-02-19T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.961593 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.961645 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.961657 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.961681 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:28 crc kubenswrapper[4873]: I0219 09:46:28.961696 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:28Z","lastTransitionTime":"2026-02-19T09:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.064663 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.065171 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.065342 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.065487 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.065617 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:29Z","lastTransitionTime":"2026-02-19T09:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.168871 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.169262 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.169426 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.169652 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.169807 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:29Z","lastTransitionTime":"2026-02-19T09:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.273578 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.273642 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.273663 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.273687 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.273707 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:29Z","lastTransitionTime":"2026-02-19T09:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.317190 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.317256 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.317278 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.317307 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.317329 4873 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T09:46:29Z","lastTransitionTime":"2026-02-19T09:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.387331 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx"] Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.387714 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.392553 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.392570 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.392673 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.392671 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.443410 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=15.443379186 podStartE2EDuration="15.443379186s" podCreationTimestamp="2026-02-19 09:46:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:46:29.442909835 +0000 UTC m=+98.732341553" watchObservedRunningTime="2026-02-19 09:46:29.443379186 +0000 UTC m=+98.732810864" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.485841 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.486218 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0885c13-02f8-4892-8f84-bcb38f36cfe8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: \"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.485875 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.485966 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.486281 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c0885c13-02f8-4892-8f84-bcb38f36cfe8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: \"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.486393 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c0885c13-02f8-4892-8f84-bcb38f36cfe8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: \"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.486425 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0885c13-02f8-4892-8f84-bcb38f36cfe8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: \"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.486500 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0885c13-02f8-4892-8f84-bcb38f36cfe8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: \"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: E0219 09:46:29.486954 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:29 crc kubenswrapper[4873]: E0219 09:46:29.487083 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:29 crc kubenswrapper[4873]: E0219 09:46:29.487259 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.512947 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podStartSLOduration=72.512897586 podStartE2EDuration="1m12.512897586s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:46:29.485421557 +0000 UTC m=+98.774853245" watchObservedRunningTime="2026-02-19 09:46:29.512897586 +0000 UTC m=+98.802329264" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.529341 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=75.52930999 podStartE2EDuration="1m15.52930999s" podCreationTimestamp="2026-02-19 09:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:46:29.527390017 +0000 UTC m=+98.816821705" watchObservedRunningTime="2026-02-19 09:46:29.52930999 +0000 UTC m=+98.818741678" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.551128 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2025-11-28 17:57:03.697731662 +0000 UTC Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.551243 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.560450 4873 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.587150 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0885c13-02f8-4892-8f84-bcb38f36cfe8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: \"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.587197 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0885c13-02f8-4892-8f84-bcb38f36cfe8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: \"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.587218 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c0885c13-02f8-4892-8f84-bcb38f36cfe8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: \"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.587245 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c0885c13-02f8-4892-8f84-bcb38f36cfe8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: 
\"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.587261 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0885c13-02f8-4892-8f84-bcb38f36cfe8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: \"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.587389 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c0885c13-02f8-4892-8f84-bcb38f36cfe8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: \"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.587430 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c0885c13-02f8-4892-8f84-bcb38f36cfe8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: \"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.588202 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0885c13-02f8-4892-8f84-bcb38f36cfe8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: \"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.596483 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c0885c13-02f8-4892-8f84-bcb38f36cfe8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: \"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.598708 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4pk8x" podStartSLOduration=72.598690487 podStartE2EDuration="1m12.598690487s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:46:29.569683284 +0000 UTC m=+98.859114962" watchObservedRunningTime="2026-02-19 09:46:29.598690487 +0000 UTC m=+98.888122155" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.613565 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0885c13-02f8-4892-8f84-bcb38f36cfe8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wdlmx\" (UID: \"c0885c13-02f8-4892-8f84-bcb38f36cfe8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.620612 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-n2lwn" podStartSLOduration=72.620590802 podStartE2EDuration="1m12.620590802s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:46:29.599665589 +0000 UTC m=+98.889097237" watchObservedRunningTime="2026-02-19 09:46:29.620590802 +0000 UTC m=+98.910022450" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.621712 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t7gjb" podStartSLOduration=72.621697607 podStartE2EDuration="1m12.621697607s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:46:29.621209396 +0000 UTC m=+98.910641074" watchObservedRunningTime="2026-02-19 09:46:29.621697607 +0000 UTC m=+98.911129255" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.638190 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.638171892 podStartE2EDuration="1m18.638171892s" podCreationTimestamp="2026-02-19 09:45:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:46:29.637883186 +0000 UTC m=+98.927314834" watchObservedRunningTime="2026-02-19 09:46:29.638171892 +0000 UTC m=+98.927603520" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.709491 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.725513 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kbv7k" podStartSLOduration=72.725495467 podStartE2EDuration="1m12.725495467s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:46:29.707238412 +0000 UTC m=+98.996670060" watchObservedRunningTime="2026-02-19 09:46:29.725495467 +0000 UTC m=+99.014927115" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.743984 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=21.743950576 podStartE2EDuration="21.743950576s" podCreationTimestamp="2026-02-19 09:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:46:29.741833969 +0000 UTC m=+99.031265647" watchObservedRunningTime="2026-02-19 09:46:29.743950576 +0000 UTC m=+99.033382234" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.761827 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=43.761806052 podStartE2EDuration="43.761806052s" podCreationTimestamp="2026-02-19 09:45:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:46:29.760353079 +0000 UTC m=+99.049784757" watchObservedRunningTime="2026-02-19 09:46:29.761806052 +0000 UTC m=+99.051237690" Feb 19 09:46:29 crc kubenswrapper[4873]: I0219 09:46:29.810021 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/node-resolver-pp77w" podStartSLOduration=72.81000334 podStartE2EDuration="1m12.81000334s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:46:29.809439517 +0000 UTC m=+99.098871165" watchObservedRunningTime="2026-02-19 09:46:29.81000334 +0000 UTC m=+99.099434978" Feb 19 09:46:30 crc kubenswrapper[4873]: I0219 09:46:30.054250 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" event={"ID":"c0885c13-02f8-4892-8f84-bcb38f36cfe8","Type":"ContainerStarted","Data":"256e3d62682ab17bf2f38c8eed08c57aa7dbf1ea131685badc363bc76215eb08"} Feb 19 09:46:30 crc kubenswrapper[4873]: I0219 09:46:30.054335 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" event={"ID":"c0885c13-02f8-4892-8f84-bcb38f36cfe8","Type":"ContainerStarted","Data":"034d90cb62e1ad92539f0b2283896614df219d6eb9bf6d5ebfee1928b31f365d"} Feb 19 09:46:30 crc kubenswrapper[4873]: I0219 09:46:30.075327 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wdlmx" podStartSLOduration=73.075298728 podStartE2EDuration="1m13.075298728s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:46:30.073549429 +0000 UTC m=+99.362981117" watchObservedRunningTime="2026-02-19 09:46:30.075298728 +0000 UTC m=+99.364730396" Feb 19 09:46:30 crc kubenswrapper[4873]: I0219 09:46:30.483632 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:30 crc kubenswrapper[4873]: E0219 09:46:30.483857 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:31 crc kubenswrapper[4873]: I0219 09:46:31.483658 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:31 crc kubenswrapper[4873]: E0219 09:46:31.485559 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:31 crc kubenswrapper[4873]: I0219 09:46:31.485676 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:31 crc kubenswrapper[4873]: I0219 09:46:31.485773 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:31 crc kubenswrapper[4873]: E0219 09:46:31.486024 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:31 crc kubenswrapper[4873]: E0219 09:46:31.486130 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:32 crc kubenswrapper[4873]: I0219 09:46:32.483284 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:32 crc kubenswrapper[4873]: E0219 09:46:32.483648 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:33 crc kubenswrapper[4873]: I0219 09:46:33.484065 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:33 crc kubenswrapper[4873]: I0219 09:46:33.484189 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:33 crc kubenswrapper[4873]: I0219 09:46:33.484099 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:33 crc kubenswrapper[4873]: E0219 09:46:33.484510 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:33 crc kubenswrapper[4873]: E0219 09:46:33.484789 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:33 crc kubenswrapper[4873]: E0219 09:46:33.484914 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:34 crc kubenswrapper[4873]: I0219 09:46:34.483465 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:34 crc kubenswrapper[4873]: E0219 09:46:34.483623 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:34 crc kubenswrapper[4873]: I0219 09:46:34.484210 4873 scope.go:117] "RemoveContainer" containerID="e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308" Feb 19 09:46:34 crc kubenswrapper[4873]: E0219 09:46:34.484372 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" Feb 19 09:46:35 crc kubenswrapper[4873]: I0219 09:46:35.484023 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:35 crc kubenswrapper[4873]: I0219 09:46:35.484196 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:35 crc kubenswrapper[4873]: I0219 09:46:35.484201 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:35 crc kubenswrapper[4873]: E0219 09:46:35.484368 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:35 crc kubenswrapper[4873]: E0219 09:46:35.484514 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:35 crc kubenswrapper[4873]: E0219 09:46:35.484640 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:35 crc kubenswrapper[4873]: I0219 09:46:35.858699 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs\") pod \"network-metrics-daemon-lcp8k\" (UID: \"98d35597-056d-48f0-b599-28b098dd45f3\") " pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:35 crc kubenswrapper[4873]: E0219 09:46:35.858913 4873 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 09:46:35 crc kubenswrapper[4873]: E0219 09:46:35.859450 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs podName:98d35597-056d-48f0-b599-28b098dd45f3 nodeName:}" failed. No retries permitted until 2026-02-19 09:47:39.859417676 +0000 UTC m=+169.148849354 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs") pod "network-metrics-daemon-lcp8k" (UID: "98d35597-056d-48f0-b599-28b098dd45f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 09:46:36 crc kubenswrapper[4873]: I0219 09:46:36.483811 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:36 crc kubenswrapper[4873]: E0219 09:46:36.483977 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:37 crc kubenswrapper[4873]: I0219 09:46:37.483715 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:37 crc kubenswrapper[4873]: I0219 09:46:37.483757 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:37 crc kubenswrapper[4873]: I0219 09:46:37.483767 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:37 crc kubenswrapper[4873]: E0219 09:46:37.484722 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:37 crc kubenswrapper[4873]: E0219 09:46:37.484762 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:37 crc kubenswrapper[4873]: E0219 09:46:37.484572 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:38 crc kubenswrapper[4873]: I0219 09:46:38.483419 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:38 crc kubenswrapper[4873]: E0219 09:46:38.483589 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:39 crc kubenswrapper[4873]: I0219 09:46:39.483560 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:39 crc kubenswrapper[4873]: I0219 09:46:39.483637 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:39 crc kubenswrapper[4873]: E0219 09:46:39.484082 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:39 crc kubenswrapper[4873]: I0219 09:46:39.483638 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:39 crc kubenswrapper[4873]: E0219 09:46:39.484256 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:39 crc kubenswrapper[4873]: E0219 09:46:39.484338 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:40 crc kubenswrapper[4873]: I0219 09:46:40.483331 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:40 crc kubenswrapper[4873]: E0219 09:46:40.483508 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:41 crc kubenswrapper[4873]: I0219 09:46:41.483913 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:41 crc kubenswrapper[4873]: I0219 09:46:41.484137 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:41 crc kubenswrapper[4873]: I0219 09:46:41.484198 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:41 crc kubenswrapper[4873]: E0219 09:46:41.486059 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:41 crc kubenswrapper[4873]: E0219 09:46:41.486295 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:41 crc kubenswrapper[4873]: E0219 09:46:41.486396 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:42 crc kubenswrapper[4873]: I0219 09:46:42.484071 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:42 crc kubenswrapper[4873]: E0219 09:46:42.484321 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:43 crc kubenswrapper[4873]: I0219 09:46:43.484020 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:43 crc kubenswrapper[4873]: I0219 09:46:43.484029 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:43 crc kubenswrapper[4873]: I0219 09:46:43.484167 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:43 crc kubenswrapper[4873]: E0219 09:46:43.484487 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:43 crc kubenswrapper[4873]: E0219 09:46:43.484648 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:43 crc kubenswrapper[4873]: E0219 09:46:43.484830 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:44 crc kubenswrapper[4873]: I0219 09:46:44.483723 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:44 crc kubenswrapper[4873]: E0219 09:46:44.483883 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:45 crc kubenswrapper[4873]: I0219 09:46:45.483640 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:45 crc kubenswrapper[4873]: I0219 09:46:45.483692 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:45 crc kubenswrapper[4873]: I0219 09:46:45.483659 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:45 crc kubenswrapper[4873]: E0219 09:46:45.483827 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:45 crc kubenswrapper[4873]: E0219 09:46:45.484182 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:45 crc kubenswrapper[4873]: E0219 09:46:45.484020 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:46 crc kubenswrapper[4873]: I0219 09:46:46.483941 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:46 crc kubenswrapper[4873]: E0219 09:46:46.484086 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:47 crc kubenswrapper[4873]: I0219 09:46:47.483254 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:47 crc kubenswrapper[4873]: I0219 09:46:47.483268 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:47 crc kubenswrapper[4873]: I0219 09:46:47.483388 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:47 crc kubenswrapper[4873]: E0219 09:46:47.483511 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:47 crc kubenswrapper[4873]: E0219 09:46:47.483659 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:47 crc kubenswrapper[4873]: E0219 09:46:47.484181 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:47 crc kubenswrapper[4873]: I0219 09:46:47.484683 4873 scope.go:117] "RemoveContainer" containerID="e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308" Feb 19 09:46:47 crc kubenswrapper[4873]: E0219 09:46:47.485004 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j94bh_openshift-ovn-kubernetes(a7760a15-9ea0-42f0-b42b-72de30071d14)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" Feb 19 09:46:48 crc kubenswrapper[4873]: I0219 09:46:48.483358 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:48 crc kubenswrapper[4873]: E0219 09:46:48.483517 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:49 crc kubenswrapper[4873]: I0219 09:46:49.483439 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:49 crc kubenswrapper[4873]: I0219 09:46:49.483470 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:49 crc kubenswrapper[4873]: I0219 09:46:49.483527 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:49 crc kubenswrapper[4873]: E0219 09:46:49.484259 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:49 crc kubenswrapper[4873]: E0219 09:46:49.484369 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:49 crc kubenswrapper[4873]: E0219 09:46:49.484444 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:50 crc kubenswrapper[4873]: I0219 09:46:50.483593 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:50 crc kubenswrapper[4873]: E0219 09:46:50.483722 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:51 crc kubenswrapper[4873]: I0219 09:46:51.129747 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4pk8x_e1ae3d8d-27cf-489f-a6ba-ef914db74bff/kube-multus/1.log" Feb 19 09:46:51 crc kubenswrapper[4873]: I0219 09:46:51.130556 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4pk8x_e1ae3d8d-27cf-489f-a6ba-ef914db74bff/kube-multus/0.log" Feb 19 09:46:51 crc kubenswrapper[4873]: I0219 09:46:51.130610 4873 generic.go:334] "Generic (PLEG): container finished" podID="e1ae3d8d-27cf-489f-a6ba-ef914db74bff" containerID="81ec7da29e3b03fb97f0d183d69bb256ed8f7340ca5df7e0c44bedd129b968cc" exitCode=1 Feb 19 09:46:51 crc kubenswrapper[4873]: I0219 09:46:51.130646 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4pk8x" event={"ID":"e1ae3d8d-27cf-489f-a6ba-ef914db74bff","Type":"ContainerDied","Data":"81ec7da29e3b03fb97f0d183d69bb256ed8f7340ca5df7e0c44bedd129b968cc"} Feb 19 09:46:51 crc kubenswrapper[4873]: I0219 09:46:51.130685 4873 scope.go:117] "RemoveContainer" containerID="6e5a838a826471a87367bdcbb5dc9b7586d8c2a751ffc5431bab3269e1a0b003" Feb 19 09:46:51 crc kubenswrapper[4873]: I0219 09:46:51.134803 4873 scope.go:117] "RemoveContainer" containerID="81ec7da29e3b03fb97f0d183d69bb256ed8f7340ca5df7e0c44bedd129b968cc" Feb 19 09:46:51 crc kubenswrapper[4873]: E0219 09:46:51.137484 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-4pk8x_openshift-multus(e1ae3d8d-27cf-489f-a6ba-ef914db74bff)\"" pod="openshift-multus/multus-4pk8x" podUID="e1ae3d8d-27cf-489f-a6ba-ef914db74bff" Feb 19 09:46:51 crc kubenswrapper[4873]: E0219 09:46:51.462210 4873 kubelet_node_status.go:497] "Node not becoming ready in time after 
startup" Feb 19 09:46:51 crc kubenswrapper[4873]: I0219 09:46:51.484167 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:51 crc kubenswrapper[4873]: I0219 09:46:51.484198 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:51 crc kubenswrapper[4873]: I0219 09:46:51.484498 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:51 crc kubenswrapper[4873]: E0219 09:46:51.485846 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:51 crc kubenswrapper[4873]: E0219 09:46:51.486024 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:51 crc kubenswrapper[4873]: E0219 09:46:51.486190 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:51 crc kubenswrapper[4873]: E0219 09:46:51.601395 4873 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 09:46:52 crc kubenswrapper[4873]: I0219 09:46:52.137607 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4pk8x_e1ae3d8d-27cf-489f-a6ba-ef914db74bff/kube-multus/1.log" Feb 19 09:46:52 crc kubenswrapper[4873]: I0219 09:46:52.483926 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:52 crc kubenswrapper[4873]: E0219 09:46:52.484137 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:53 crc kubenswrapper[4873]: I0219 09:46:53.484270 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:53 crc kubenswrapper[4873]: I0219 09:46:53.484287 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:53 crc kubenswrapper[4873]: E0219 09:46:53.484431 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:53 crc kubenswrapper[4873]: I0219 09:46:53.484571 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:53 crc kubenswrapper[4873]: E0219 09:46:53.484630 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:53 crc kubenswrapper[4873]: E0219 09:46:53.484783 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:54 crc kubenswrapper[4873]: I0219 09:46:54.484387 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:54 crc kubenswrapper[4873]: E0219 09:46:54.484542 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:55 crc kubenswrapper[4873]: I0219 09:46:55.483372 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:55 crc kubenswrapper[4873]: I0219 09:46:55.483428 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:55 crc kubenswrapper[4873]: E0219 09:46:55.483670 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:55 crc kubenswrapper[4873]: I0219 09:46:55.483817 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:55 crc kubenswrapper[4873]: E0219 09:46:55.484098 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:55 crc kubenswrapper[4873]: E0219 09:46:55.484638 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:56 crc kubenswrapper[4873]: I0219 09:46:56.483483 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:56 crc kubenswrapper[4873]: E0219 09:46:56.483622 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:56 crc kubenswrapper[4873]: E0219 09:46:56.602787 4873 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 09:46:57 crc kubenswrapper[4873]: I0219 09:46:57.483927 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:57 crc kubenswrapper[4873]: I0219 09:46:57.483937 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:57 crc kubenswrapper[4873]: I0219 09:46:57.484163 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:57 crc kubenswrapper[4873]: E0219 09:46:57.484354 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 09:46:57 crc kubenswrapper[4873]: E0219 09:46:57.484446 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:57 crc kubenswrapper[4873]: E0219 09:46:57.484574 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:58 crc kubenswrapper[4873]: I0219 09:46:58.484746 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:46:58 crc kubenswrapper[4873]: E0219 09:46:58.484858 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 09:46:58 crc kubenswrapper[4873]: I0219 09:46:58.485426 4873 scope.go:117] "RemoveContainer" containerID="e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308" Feb 19 09:46:59 crc kubenswrapper[4873]: I0219 09:46:59.164910 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/3.log" Feb 19 09:46:59 crc kubenswrapper[4873]: I0219 09:46:59.167255 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerStarted","Data":"fbe398acea08ecbb128c7f23474abd3c929b29591afd83ce34befc3c628c7ddb"} Feb 19 09:46:59 crc kubenswrapper[4873]: I0219 09:46:59.167768 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:46:59 crc kubenswrapper[4873]: I0219 09:46:59.202394 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podStartSLOduration=102.202369799 podStartE2EDuration="1m42.202369799s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:46:59.201715865 +0000 UTC m=+128.491147523" 
watchObservedRunningTime="2026-02-19 09:46:59.202369799 +0000 UTC m=+128.491801457" Feb 19 09:46:59 crc kubenswrapper[4873]: I0219 09:46:59.484140 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:46:59 crc kubenswrapper[4873]: I0219 09:46:59.484211 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:46:59 crc kubenswrapper[4873]: I0219 09:46:59.484159 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:46:59 crc kubenswrapper[4873]: E0219 09:46:59.484317 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3" Feb 19 09:46:59 crc kubenswrapper[4873]: E0219 09:46:59.484410 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 09:46:59 crc kubenswrapper[4873]: E0219 09:46:59.484609 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 09:46:59 crc kubenswrapper[4873]: I0219 09:46:59.528902 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lcp8k"]
Feb 19 09:47:00 crc kubenswrapper[4873]: I0219 09:47:00.169621 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k"
Feb 19 09:47:00 crc kubenswrapper[4873]: E0219 09:47:00.169721 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3"
Feb 19 09:47:00 crc kubenswrapper[4873]: I0219 09:47:00.483976 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 09:47:00 crc kubenswrapper[4873]: E0219 09:47:00.484223 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 09:47:01 crc kubenswrapper[4873]: I0219 09:47:01.483480 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k"
Feb 19 09:47:01 crc kubenswrapper[4873]: I0219 09:47:01.483507 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 09:47:01 crc kubenswrapper[4873]: I0219 09:47:01.485161 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 09:47:01 crc kubenswrapper[4873]: E0219 09:47:01.485155 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3"
Feb 19 09:47:01 crc kubenswrapper[4873]: E0219 09:47:01.485333 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 09:47:01 crc kubenswrapper[4873]: E0219 09:47:01.485454 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 09:47:01 crc kubenswrapper[4873]: E0219 09:47:01.604197 4873 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 09:47:02 crc kubenswrapper[4873]: I0219 09:47:02.483792 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 09:47:02 crc kubenswrapper[4873]: E0219 09:47:02.485030 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 09:47:02 crc kubenswrapper[4873]: I0219 09:47:02.484334 4873 scope.go:117] "RemoveContainer" containerID="81ec7da29e3b03fb97f0d183d69bb256ed8f7340ca5df7e0c44bedd129b968cc"
Feb 19 09:47:03 crc kubenswrapper[4873]: I0219 09:47:03.187515 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4pk8x_e1ae3d8d-27cf-489f-a6ba-ef914db74bff/kube-multus/1.log"
Feb 19 09:47:03 crc kubenswrapper[4873]: I0219 09:47:03.187889 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4pk8x" event={"ID":"e1ae3d8d-27cf-489f-a6ba-ef914db74bff","Type":"ContainerStarted","Data":"ef1d74ca48faafc4bbde6d98d0cbea070a166074ced1ae06003180d6fd64ebb2"}
Feb 19 09:47:03 crc kubenswrapper[4873]: I0219 09:47:03.483488 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 09:47:03 crc kubenswrapper[4873]: I0219 09:47:03.483509 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 09:47:03 crc kubenswrapper[4873]: E0219 09:47:03.484023 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 09:47:03 crc kubenswrapper[4873]: I0219 09:47:03.483603 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k"
Feb 19 09:47:03 crc kubenswrapper[4873]: E0219 09:47:03.484259 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3"
Feb 19 09:47:03 crc kubenswrapper[4873]: E0219 09:47:03.484986 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 09:47:04 crc kubenswrapper[4873]: I0219 09:47:04.483406 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 09:47:04 crc kubenswrapper[4873]: E0219 09:47:04.483624 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 09:47:05 crc kubenswrapper[4873]: I0219 09:47:05.484082 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k"
Feb 19 09:47:05 crc kubenswrapper[4873]: I0219 09:47:05.484177 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 09:47:05 crc kubenswrapper[4873]: I0219 09:47:05.484126 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 09:47:05 crc kubenswrapper[4873]: E0219 09:47:05.484885 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lcp8k" podUID="98d35597-056d-48f0-b599-28b098dd45f3"
Feb 19 09:47:05 crc kubenswrapper[4873]: E0219 09:47:05.485148 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 09:47:05 crc kubenswrapper[4873]: E0219 09:47:05.485838 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 09:47:06 crc kubenswrapper[4873]: I0219 09:47:06.483419 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 09:47:06 crc kubenswrapper[4873]: E0219 09:47:06.483603 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 09:47:07 crc kubenswrapper[4873]: I0219 09:47:07.484164 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 09:47:07 crc kubenswrapper[4873]: I0219 09:47:07.484265 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 09:47:07 crc kubenswrapper[4873]: I0219 09:47:07.485139 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k"
Feb 19 09:47:07 crc kubenswrapper[4873]: I0219 09:47:07.493958 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 19 09:47:07 crc kubenswrapper[4873]: I0219 09:47:07.495381 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 19 09:47:07 crc kubenswrapper[4873]: I0219 09:47:07.495506 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 19 09:47:07 crc kubenswrapper[4873]: I0219 09:47:07.495795 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 19 09:47:07 crc kubenswrapper[4873]: I0219 09:47:07.496046 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 19 09:47:07 crc kubenswrapper[4873]: I0219 09:47:07.496058 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 19 09:47:08 crc kubenswrapper[4873]: I0219 09:47:08.483996 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.480045 4873 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.547273 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-k627b"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.547780 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.553001 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.553079 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.553213 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.553345 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.553410 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.553516 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.553621 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.553725 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dxcz7"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.554588 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.557350 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.557457 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-dxcz7"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.558150 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.558276 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.558315 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qvxgz"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.558548 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.558567 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.558818 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.559723 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.559742 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4g545"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.559850 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.562179 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4g545"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.562721 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.562889 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.565268 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.565641 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.571705 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.572537 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.572623 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.572627 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.572855 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.574223 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.575016 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.575802 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.576026 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.575815 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.577095 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.577141 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.577155 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.577346 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.577368 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.577428 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.577432 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.577440 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.577585 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.577611 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.577694 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.577867 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.577988 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.578132 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.578425 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jsc24"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.578896 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.579245 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.579589 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.579721 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.579817 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.579912 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.580050 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.580062 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.580167 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.580300 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.580420 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.580699 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.580798 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.581566 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.581727 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.581994 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gbzll"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.586415 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.587001 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gbzll"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.587767 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.588247 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.593175 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.593809 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.594435 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.594478 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.605281 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.605561 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.605720 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.605984 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.612505 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.613664 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.613699 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.614153 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.615296 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.615525 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.615301 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.616356 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.616808 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.621337 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.621571 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.643557 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-shnwj"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.643805 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dxcz7"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.643850 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.643875 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-shnwj"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.644163 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.644291 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.644391 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.644495 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.644533 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.644610 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.644713 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.644868 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.644900 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.644952 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.645017 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.645036 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.645135 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.646581 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.647010 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.647995 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.648330 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.648714 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.649369 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.649879 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.650392 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.650498 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.651297 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-9pq25"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.651648 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9pq25"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.652053 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5tt6k"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.652431 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5tt6k"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.655065 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656075 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2d87932-1993-464d-b3d2-71025526e1f2-serving-cert\") pod \"console-operator-58897d9998-dxcz7\" (UID: \"b2d87932-1993-464d-b3d2-71025526e1f2\") " pod="openshift-console-operator/console-operator-58897d9998-dxcz7"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656168 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656199 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656223 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e9b2e26-976d-498c-88d8-dbddd520c9bf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656250 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csvqh\" (UniqueName: \"kubernetes.io/projected/b2d87932-1993-464d-b3d2-71025526e1f2-kube-api-access-csvqh\") pod \"console-operator-58897d9998-dxcz7\" (UID: \"b2d87932-1993-464d-b3d2-71025526e1f2\") " pod="openshift-console-operator/console-operator-58897d9998-dxcz7"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656292 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656315 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-client-ca\") pod \"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656336 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c1d3a6-23fd-4526-8892-0add23b09a9a-serving-cert\") pod \"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656375 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656401 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-config\") pod \"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656422 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656447 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpxsx\" (UniqueName: \"kubernetes.io/projected/9324aa8b-fbce-42bb-b339-0aa2e382efd4-kube-api-access-fpxsx\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656475 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd468f98-7720-4f9a-972f-684b96f4f90f-config\") pod \"route-controller-manager-6576b87f9c-qltqp\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656495 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbm8s\" (UniqueName: \"kubernetes.io/projected/df659e7d-39ab-41ee-8df5-08896976666c-kube-api-access-tbm8s\") pod \"machine-api-operator-5694c8668f-k627b\" (UID: \"df659e7d-39ab-41ee-8df5-08896976666c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656520 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5968ec26-dea6-4e79-99b1-5954e173d226-etcd-client\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656541 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb18a52e-b1db-406b-a2e8-88a1ae8b05fc-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2kgbd\" (UID: \"bb18a52e-b1db-406b-a2e8-88a1ae8b05fc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656561 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd468f98-7720-4f9a-972f-684b96f4f90f-serving-cert\") pod \"route-controller-manager-6576b87f9c-qltqp\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656587 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5198f9e2-ae27-4804-ab74-0759a5217d89-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jt5wx\" (UID: \"5198f9e2-ae27-4804-ab74-0759a5217d89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656607 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5968ec26-dea6-4e79-99b1-5954e173d226-audit-dir\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656627 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656647 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd468f98-7720-4f9a-972f-684b96f4f90f-client-ca\") pod \"route-controller-manager-6576b87f9c-qltqp\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656679 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3300ef2b-adb8-4aea-b8ef-cdec19d504b3-auth-proxy-config\") pod \"machine-approver-56656f9798-r7gp2\" (UID: \"3300ef2b-adb8-4aea-b8ef-cdec19d504b3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656730 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3300ef2b-adb8-4aea-b8ef-cdec19d504b3-config\") pod \"machine-approver-56656f9798-r7gp2\" (UID: \"3300ef2b-adb8-4aea-b8ef-cdec19d504b3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656751 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltjkk\" (UniqueName: \"kubernetes.io/projected/3300ef2b-adb8-4aea-b8ef-cdec19d504b3-kube-api-access-ltjkk\") pod \"machine-approver-56656f9798-r7gp2\" (UID: \"3300ef2b-adb8-4aea-b8ef-cdec19d504b3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656776 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656800 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/df659e7d-39ab-41ee-8df5-08896976666c-images\") pod \"machine-api-operator-5694c8668f-k627b\" (UID: \"df659e7d-39ab-41ee-8df5-08896976666c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656823 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2d87932-1993-464d-b3d2-71025526e1f2-config\") pod \"console-operator-58897d9998-dxcz7\" (UID: \"b2d87932-1993-464d-b3d2-71025526e1f2\") " pod="openshift-console-operator/console-operator-58897d9998-dxcz7" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656849 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5198f9e2-ae27-4804-ab74-0759a5217d89-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jt5wx\" (UID: \"5198f9e2-ae27-4804-ab74-0759a5217d89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656872 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656893 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df659e7d-39ab-41ee-8df5-08896976666c-config\") pod \"machine-api-operator-5694c8668f-k627b\" (UID: 
\"df659e7d-39ab-41ee-8df5-08896976666c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656921 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-etcd-serving-ca\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656955 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e9b2e26-976d-498c-88d8-dbddd520c9bf-audit-policies\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.656983 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657017 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657039 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-config\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657061 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9e9b2e26-976d-498c-88d8-dbddd520c9bf-etcd-client\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657080 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/595c8db4-733e-4729-aa34-8be7307043a8-config\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657121 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/595c8db4-733e-4729-aa34-8be7307043a8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657151 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 
09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657172 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5968ec26-dea6-4e79-99b1-5954e173d226-encryption-config\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657196 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd4sz\" (UniqueName: \"kubernetes.io/projected/5198f9e2-ae27-4804-ab74-0759a5217d89-kube-api-access-gd4sz\") pod \"openshift-apiserver-operator-796bbdcf4f-jt5wx\" (UID: \"5198f9e2-ae27-4804-ab74-0759a5217d89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657219 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-audit-policies\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657245 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657269 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/df659e7d-39ab-41ee-8df5-08896976666c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-k627b\" (UID: \"df659e7d-39ab-41ee-8df5-08896976666c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657431 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657473 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/595c8db4-733e-4729-aa34-8be7307043a8-service-ca-bundle\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657505 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jkhf\" (UniqueName: \"kubernetes.io/projected/595c8db4-733e-4729-aa34-8be7307043a8-kube-api-access-6jkhf\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657549 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e9b2e26-976d-498c-88d8-dbddd520c9bf-audit-dir\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: 
\"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657587 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e9b2e26-976d-498c-88d8-dbddd520c9bf-serving-cert\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657613 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9e9b2e26-976d-498c-88d8-dbddd520c9bf-encryption-config\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657653 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2d87932-1993-464d-b3d2-71025526e1f2-trusted-ca\") pod \"console-operator-58897d9998-dxcz7\" (UID: \"b2d87932-1993-464d-b3d2-71025526e1f2\") " pod="openshift-console-operator/console-operator-58897d9998-dxcz7" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657677 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9e9b2e26-976d-498c-88d8-dbddd520c9bf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657708 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/5968ec26-dea6-4e79-99b1-5954e173d226-node-pullsecrets\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657745 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-audit\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657765 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r54wb\" (UniqueName: \"kubernetes.io/projected/5968ec26-dea6-4e79-99b1-5954e173d226-kube-api-access-r54wb\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657790 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3300ef2b-adb8-4aea-b8ef-cdec19d504b3-machine-approver-tls\") pod \"machine-approver-56656f9798-r7gp2\" (UID: \"3300ef2b-adb8-4aea-b8ef-cdec19d504b3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657810 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/595c8db4-733e-4729-aa34-8be7307043a8-serving-cert\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 
09:47:10.657834 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfvdj\" (UniqueName: \"kubernetes.io/projected/bb18a52e-b1db-406b-a2e8-88a1ae8b05fc-kube-api-access-nfvdj\") pod \"openshift-controller-manager-operator-756b6f6bc6-2kgbd\" (UID: \"bb18a52e-b1db-406b-a2e8-88a1ae8b05fc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657889 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-image-import-ca\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657916 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb18a52e-b1db-406b-a2e8-88a1ae8b05fc-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2kgbd\" (UID: \"bb18a52e-b1db-406b-a2e8-88a1ae8b05fc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657937 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll4wz\" (UniqueName: \"kubernetes.io/projected/bd468f98-7720-4f9a-972f-684b96f4f90f-kube-api-access-ll4wz\") pod \"route-controller-manager-6576b87f9c-qltqp\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657966 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5l9hc\" (UniqueName: \"kubernetes.io/projected/9e9b2e26-976d-498c-88d8-dbddd520c9bf-kube-api-access-5l9hc\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.657986 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5968ec26-dea6-4e79-99b1-5954e173d226-serving-cert\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.658011 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9324aa8b-fbce-42bb-b339-0aa2e382efd4-audit-dir\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.658033 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmb7k\" (UniqueName: \"kubernetes.io/projected/c8c1d3a6-23fd-4526-8892-0add23b09a9a-kube-api-access-mmb7k\") pod \"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.661946 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.662461 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s67xb"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.662889 4873 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s67xb"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.663595 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.664013 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.664152 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.665302 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.665310 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.665636 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.665988 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.666499 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lkp4m"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.666553 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.666657 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.666752 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.666836 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.667044 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lkp4m"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.667400 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.672163 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.672653 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.672728 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.672991 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.674284 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.682494 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.682932 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-kzpbf"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.683774 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kzpbf"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.684792 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.686844 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.691419 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.691954 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.691265 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.694339 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.694617 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.695235 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.697524 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.697567 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.698469 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.700399 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.701163 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.703527 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.711527 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qmrn5"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.711917 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.711935 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.712218 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.712436 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.712566 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.712649 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qmrn5"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.713143 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.713457 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.713851 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tjxkj"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.714274 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.714324 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-tjxkj"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.714381 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.714555 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.714885 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.716961 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.717996 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7hhjq"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.718347 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-86hhq"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.718687 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.718696 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.718722 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.719001 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.720717 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4g545"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.720901 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qvxgz"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.721806 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-shnwj"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.722748 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.725680 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.726668 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-2798g"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.727543 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2798g"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.727939 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vklwp"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.729605 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.729633 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.729729 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.729732 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.730548 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5tt6k"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.731727 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gbzll"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.732929 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.734228 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s67xb"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.735591 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.736932 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.738749 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9pq25"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.740052 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-k627b"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.741823 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tjxkj"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.742398 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.744088 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.745212 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.747003 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.748262 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lkp4m"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.749349 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.750977 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.757537 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv"]
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759126 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e9b2e26-976d-498c-88d8-dbddd520c9bf-audit-policies\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759160 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/34f3caca-1b4c-493d-a10b-277b42d7ce72-default-certificate\") pod \"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759186 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lqpt\" (UniqueName: \"kubernetes.io/projected/3c4f7134-312f-4f1d-a344-80d44d65c371-kube-api-access-5lqpt\") pod \"openshift-config-operator-7777fb866f-gk5mg\" (UID: \"3c4f7134-312f-4f1d-a344-80d44d65c371\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759220 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759241 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/10aa25f4-7549-468a-b42f-19305ad066dd-console-oauth-config\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759263 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwrbb\" (UniqueName: \"kubernetes.io/projected/57d54c43-611a-40f1-b05e-9a0007dbe3ec-kube-api-access-hwrbb\") pod \"migrator-59844c95c7-5tt6k\" (UID: \"57d54c43-611a-40f1-b05e-9a0007dbe3ec\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5tt6k"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759293 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d639ff25-343e-4e7c-bd2e-f5fc533923f4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-s67xb\" (UID: \"d639ff25-343e-4e7c-bd2e-f5fc533923f4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s67xb"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759315 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de77b9aa-b558-4431-b116-5e1e1cc116f3-config-volume\") pod \"collect-profiles-29524905-jqdfw\" (UID: \"de77b9aa-b558-4431-b116-5e1e1cc116f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759337 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759358 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-config\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759376 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9e9b2e26-976d-498c-88d8-dbddd520c9bf-etcd-client\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759396 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/595c8db4-733e-4729-aa34-8be7307043a8-config\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759416 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/595c8db4-733e-4729-aa34-8be7307043a8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759438 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fz4c\" (UniqueName: \"kubernetes.io/projected/d639ff25-343e-4e7c-bd2e-f5fc533923f4-kube-api-access-2fz4c\") pod \"control-plane-machine-set-operator-78cbb6b69f-s67xb\" (UID: \"d639ff25-343e-4e7c-bd2e-f5fc533923f4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s67xb"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759461 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759479 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5968ec26-dea6-4e79-99b1-5954e173d226-encryption-config\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759499 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/10aa25f4-7549-468a-b42f-19305ad066dd-console-serving-cert\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759519 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34f3caca-1b4c-493d-a10b-277b42d7ce72-service-ca-bundle\") pod \"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759541 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad63def1-18c4-4841-a936-b7c7e42ce092-config\") pod \"service-ca-operator-777779d784-wfq9w\" (UID: \"ad63def1-18c4-4841-a936-b7c7e42ce092\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759562 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd4sz\" (UniqueName: \"kubernetes.io/projected/5198f9e2-ae27-4804-ab74-0759a5217d89-kube-api-access-gd4sz\") pod \"openshift-apiserver-operator-796bbdcf4f-jt5wx\" (UID: \"5198f9e2-ae27-4804-ab74-0759a5217d89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759584 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-audit-policies\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759605 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759627 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/df659e7d-39ab-41ee-8df5-08896976666c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-k627b\" (UID: \"df659e7d-39ab-41ee-8df5-08896976666c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759649 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzhg7\" (UniqueName: \"kubernetes.io/projected/5288b888-1b48-4590-8d10-f3688ba87a41-kube-api-access-rzhg7\") pod \"multus-admission-controller-857f4d67dd-lkp4m\" (UID: \"5288b888-1b48-4590-8d10-f3688ba87a41\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lkp4m"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759669 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c4f7134-312f-4f1d-a344-80d44d65c371-serving-cert\") pod \"openshift-config-operator-7777fb866f-gk5mg\" (UID: \"3c4f7134-312f-4f1d-a344-80d44d65c371\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759689 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759704 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e9b2e26-976d-498c-88d8-dbddd520c9bf-audit-policies\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759710 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/595c8db4-733e-4729-aa34-8be7307043a8-service-ca-bundle\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759730 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jkhf\" (UniqueName: \"kubernetes.io/projected/595c8db4-733e-4729-aa34-8be7307043a8-kube-api-access-6jkhf\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759750 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e9b2e26-976d-498c-88d8-dbddd520c9bf-audit-dir\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759772 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpwks\" (UniqueName: \"kubernetes.io/projected/12ef881d-885a-4215-bd57-27966cb209b8-kube-api-access-fpwks\") pod \"machine-config-operator-74547568cd-pzspc\" (UID: \"12ef881d-885a-4215-bd57-27966cb209b8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759792 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48911b55-fb42-412b-9298-4cba1105a164-config\") pod \"kube-controller-manager-operator-78b949d7b-hs5fr\" (UID: \"48911b55-fb42-412b-9298-4cba1105a164\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759811 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rhkq\" (UniqueName: \"kubernetes.io/projected/34f3caca-1b4c-493d-a10b-277b42d7ce72-kube-api-access-8rhkq\") pod \"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759830 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e9b2e26-976d-498c-88d8-dbddd520c9bf-serving-cert\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759847 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9e9b2e26-976d-498c-88d8-dbddd520c9bf-encryption-config\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759864 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-audit\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759882 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2d87932-1993-464d-b3d2-71025526e1f2-trusted-ca\") pod \"console-operator-58897d9998-dxcz7\" (UID: \"b2d87932-1993-464d-b3d2-71025526e1f2\") " pod="openshift-console-operator/console-operator-58897d9998-dxcz7"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759900 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9e9b2e26-976d-498c-88d8-dbddd520c9bf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759920 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79bb3a49-346f-49b7-bb8e-c358105f8035-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f24fn\" (UID: \"79bb3a49-346f-49b7-bb8e-c358105f8035\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759939 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5968ec26-dea6-4e79-99b1-5954e173d226-node-pullsecrets\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759960 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r54wb\" (UniqueName: \"kubernetes.io/projected/5968ec26-dea6-4e79-99b1-5954e173d226-kube-api-access-r54wb\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759978 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3300ef2b-adb8-4aea-b8ef-cdec19d504b3-machine-approver-tls\") pod \"machine-approver-56656f9798-r7gp2\" (UID: \"3300ef2b-adb8-4aea-b8ef-cdec19d504b3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.759995 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/595c8db4-733e-4729-aa34-8be7307043a8-serving-cert\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760016 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfvdj\" (UniqueName: \"kubernetes.io/projected/bb18a52e-b1db-406b-a2e8-88a1ae8b05fc-kube-api-access-nfvdj\") pod \"openshift-controller-manager-operator-756b6f6bc6-2kgbd\" (UID: \"bb18a52e-b1db-406b-a2e8-88a1ae8b05fc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760035 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48911b55-fb42-412b-9298-4cba1105a164-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hs5fr\" (UID: \"48911b55-fb42-412b-9298-4cba1105a164\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760055 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb18a52e-b1db-406b-a2e8-88a1ae8b05fc-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2kgbd\" (UID: \"bb18a52e-b1db-406b-a2e8-88a1ae8b05fc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760076 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5288b888-1b48-4590-8d10-f3688ba87a41-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lkp4m\" (UID: \"5288b888-1b48-4590-8d10-f3688ba87a41\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lkp4m"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760097 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de77b9aa-b558-4431-b116-5e1e1cc116f3-secret-volume\") pod \"collect-profiles-29524905-jqdfw\" (UID: \"de77b9aa-b558-4431-b116-5e1e1cc116f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760135 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-image-import-ca\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760155 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l9hc\" (UniqueName: \"kubernetes.io/projected/9e9b2e26-976d-498c-88d8-dbddd520c9bf-kube-api-access-5l9hc\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760174 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll4wz\" (UniqueName: \"kubernetes.io/projected/bd468f98-7720-4f9a-972f-684b96f4f90f-kube-api-access-ll4wz\") pod \"route-controller-manager-6576b87f9c-qltqp\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp"
Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760196 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgcd\" (UniqueName: \"kubernetes.io/projected/6f60efd0-54f5-43eb-b824-f8eaa836df60-kube-api-access-rkgcd\") pod \"cluster-samples-operator-665b6dd947-9hwg5\"
(UID: \"6f60efd0-54f5-43eb-b824-f8eaa836df60\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760219 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmb7k\" (UniqueName: \"kubernetes.io/projected/c8c1d3a6-23fd-4526-8892-0add23b09a9a-kube-api-access-mmb7k\") pod \"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760239 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5968ec26-dea6-4e79-99b1-5954e173d226-serving-cert\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760259 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9324aa8b-fbce-42bb-b339-0aa2e382efd4-audit-dir\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760280 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760300 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b2d87932-1993-464d-b3d2-71025526e1f2-serving-cert\") pod \"console-operator-58897d9998-dxcz7\" (UID: \"b2d87932-1993-464d-b3d2-71025526e1f2\") " pod="openshift-console-operator/console-operator-58897d9998-dxcz7" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760330 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e9b2e26-976d-498c-88d8-dbddd520c9bf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760350 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-console-config\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760368 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/06e4a751-614f-49d2-8246-c76419d1ccb4-srv-cert\") pod \"catalog-operator-68c6474976-dg6jw\" (UID: \"06e4a751-614f-49d2-8246-c76419d1ccb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760388 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760410 
4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csvqh\" (UniqueName: \"kubernetes.io/projected/b2d87932-1993-464d-b3d2-71025526e1f2-kube-api-access-csvqh\") pod \"console-operator-58897d9998-dxcz7\" (UID: \"b2d87932-1993-464d-b3d2-71025526e1f2\") " pod="openshift-console-operator/console-operator-58897d9998-dxcz7" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760435 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-oauth-serving-cert\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760470 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760489 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760509 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12ef881d-885a-4215-bd57-27966cb209b8-proxy-tls\") pod \"machine-config-operator-74547568cd-pzspc\" (UID: 
\"12ef881d-885a-4215-bd57-27966cb209b8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760528 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b881e81-67ed-4c33-a992-da59d7996b9d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bxfwb\" (UID: \"4b881e81-67ed-4c33-a992-da59d7996b9d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760549 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-client-ca\") pod \"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760568 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c1d3a6-23fd-4526-8892-0add23b09a9a-serving-cert\") pod \"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760588 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-config\") pod \"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760608 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760631 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48911b55-fb42-412b-9298-4cba1105a164-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hs5fr\" (UID: \"48911b55-fb42-412b-9298-4cba1105a164\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760651 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff89g\" (UniqueName: \"kubernetes.io/projected/4b881e81-67ed-4c33-a992-da59d7996b9d-kube-api-access-ff89g\") pod \"kube-storage-version-migrator-operator-b67b599dd-bxfwb\" (UID: \"4b881e81-67ed-4c33-a992-da59d7996b9d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760674 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpxsx\" (UniqueName: \"kubernetes.io/projected/9324aa8b-fbce-42bb-b339-0aa2e382efd4-kube-api-access-fpxsx\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760694 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd468f98-7720-4f9a-972f-684b96f4f90f-config\") pod \"route-controller-manager-6576b87f9c-qltqp\" 
(UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760716 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5968ec26-dea6-4e79-99b1-5954e173d226-etcd-client\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760736 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbm8s\" (UniqueName: \"kubernetes.io/projected/df659e7d-39ab-41ee-8df5-08896976666c-kube-api-access-tbm8s\") pod \"machine-api-operator-5694c8668f-k627b\" (UID: \"df659e7d-39ab-41ee-8df5-08896976666c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760788 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb18a52e-b1db-406b-a2e8-88a1ae8b05fc-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2kgbd\" (UID: \"bb18a52e-b1db-406b-a2e8-88a1ae8b05fc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760812 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd468f98-7720-4f9a-972f-684b96f4f90f-serving-cert\") pod \"route-controller-manager-6576b87f9c-qltqp\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760831 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-service-ca\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760853 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldgn9\" (UniqueName: \"kubernetes.io/projected/ad63def1-18c4-4841-a936-b7c7e42ce092-kube-api-access-ldgn9\") pod \"service-ca-operator-777779d784-wfq9w\" (UID: \"ad63def1-18c4-4841-a936-b7c7e42ce092\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760872 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ggch\" (UniqueName: \"kubernetes.io/projected/10aa25f4-7549-468a-b42f-19305ad066dd-kube-api-access-9ggch\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760891 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79bb3a49-346f-49b7-bb8e-c358105f8035-config\") pod \"kube-apiserver-operator-766d6c64bb-f24fn\" (UID: \"79bb3a49-346f-49b7-bb8e-c358105f8035\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760914 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f60efd0-54f5-43eb-b824-f8eaa836df60-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9hwg5\" (UID: \"6f60efd0-54f5-43eb-b824-f8eaa836df60\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760936 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5198f9e2-ae27-4804-ab74-0759a5217d89-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jt5wx\" (UID: \"5198f9e2-ae27-4804-ab74-0759a5217d89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760955 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5968ec26-dea6-4e79-99b1-5954e173d226-audit-dir\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760975 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.760995 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd468f98-7720-4f9a-972f-684b96f4f90f-client-ca\") pod \"route-controller-manager-6576b87f9c-qltqp\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761026 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/3300ef2b-adb8-4aea-b8ef-cdec19d504b3-auth-proxy-config\") pod \"machine-approver-56656f9798-r7gp2\" (UID: \"3300ef2b-adb8-4aea-b8ef-cdec19d504b3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761045 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3300ef2b-adb8-4aea-b8ef-cdec19d504b3-config\") pod \"machine-approver-56656f9798-r7gp2\" (UID: \"3300ef2b-adb8-4aea-b8ef-cdec19d504b3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761064 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltjkk\" (UniqueName: \"kubernetes.io/projected/3300ef2b-adb8-4aea-b8ef-cdec19d504b3-kube-api-access-ltjkk\") pod \"machine-approver-56656f9798-r7gp2\" (UID: \"3300ef2b-adb8-4aea-b8ef-cdec19d504b3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761085 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79bb3a49-346f-49b7-bb8e-c358105f8035-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f24fn\" (UID: \"79bb3a49-346f-49b7-bb8e-c358105f8035\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761124 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/06e4a751-614f-49d2-8246-c76419d1ccb4-profile-collector-cert\") pod \"catalog-operator-68c6474976-dg6jw\" (UID: \"06e4a751-614f-49d2-8246-c76419d1ccb4\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761150 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761171 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/df659e7d-39ab-41ee-8df5-08896976666c-images\") pod \"machine-api-operator-5694c8668f-k627b\" (UID: \"df659e7d-39ab-41ee-8df5-08896976666c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761208 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2d87932-1993-464d-b3d2-71025526e1f2-config\") pod \"console-operator-58897d9998-dxcz7\" (UID: \"b2d87932-1993-464d-b3d2-71025526e1f2\") " pod="openshift-console-operator/console-operator-58897d9998-dxcz7" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761229 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-trusted-ca-bundle\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761249 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/12ef881d-885a-4215-bd57-27966cb209b8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pzspc\" (UID: \"12ef881d-885a-4215-bd57-27966cb209b8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761269 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b881e81-67ed-4c33-a992-da59d7996b9d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bxfwb\" (UID: \"4b881e81-67ed-4c33-a992-da59d7996b9d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761292 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj2jk\" (UniqueName: \"kubernetes.io/projected/de77b9aa-b558-4431-b116-5e1e1cc116f3-kube-api-access-nj2jk\") pod \"collect-profiles-29524905-jqdfw\" (UID: \"de77b9aa-b558-4431-b116-5e1e1cc116f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761312 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5198f9e2-ae27-4804-ab74-0759a5217d89-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jt5wx\" (UID: \"5198f9e2-ae27-4804-ab74-0759a5217d89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761333 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4g545\" (UID: 
\"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761352 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df659e7d-39ab-41ee-8df5-08896976666c-config\") pod \"machine-api-operator-5694c8668f-k627b\" (UID: \"df659e7d-39ab-41ee-8df5-08896976666c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761373 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/34f3caca-1b4c-493d-a10b-277b42d7ce72-stats-auth\") pod \"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761393 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/12ef881d-885a-4215-bd57-27966cb209b8-images\") pod \"machine-config-operator-74547568cd-pzspc\" (UID: \"12ef881d-885a-4215-bd57-27966cb209b8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761411 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34f3caca-1b4c-493d-a10b-277b42d7ce72-metrics-certs\") pod \"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761435 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-etcd-serving-ca\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761454 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gczv\" (UniqueName: \"kubernetes.io/projected/e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e-kube-api-access-2gczv\") pod \"downloads-7954f5f757-9pq25\" (UID: \"e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e\") " pod="openshift-console/downloads-7954f5f757-9pq25" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761474 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4gld\" (UniqueName: \"kubernetes.io/projected/06e4a751-614f-49d2-8246-c76419d1ccb4-kube-api-access-j4gld\") pod \"catalog-operator-68c6474976-dg6jw\" (UID: \"06e4a751-614f-49d2-8246-c76419d1ccb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761494 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3c4f7134-312f-4f1d-a344-80d44d65c371-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gk5mg\" (UID: \"3c4f7134-312f-4f1d-a344-80d44d65c371\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.761514 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad63def1-18c4-4841-a936-b7c7e42ce092-serving-cert\") pod \"service-ca-operator-777779d784-wfq9w\" (UID: \"ad63def1-18c4-4841-a936-b7c7e42ce092\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.762880 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.764555 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-config\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.764879 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.770904 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-audit-policies\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.771475 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4g545\" (UID: 
\"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.772350 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-client-ca\") pod \"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.774131 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd468f98-7720-4f9a-972f-684b96f4f90f-config\") pod \"route-controller-manager-6576b87f9c-qltqp\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.774228 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p97g8"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.774294 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5968ec26-dea6-4e79-99b1-5954e173d226-audit-dir\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.774736 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3300ef2b-adb8-4aea-b8ef-cdec19d504b3-auth-proxy-config\") pod \"machine-approver-56656f9798-r7gp2\" (UID: \"3300ef2b-adb8-4aea-b8ef-cdec19d504b3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.775143 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3300ef2b-adb8-4aea-b8ef-cdec19d504b3-config\") pod \"machine-approver-56656f9798-r7gp2\" (UID: \"3300ef2b-adb8-4aea-b8ef-cdec19d504b3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.775232 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.775682 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2d87932-1993-464d-b3d2-71025526e1f2-config\") pod \"console-operator-58897d9998-dxcz7\" (UID: \"b2d87932-1993-464d-b3d2-71025526e1f2\") " pod="openshift-console-operator/console-operator-58897d9998-dxcz7" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.775790 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/df659e7d-39ab-41ee-8df5-08896976666c-images\") pod \"machine-api-operator-5694c8668f-k627b\" (UID: \"df659e7d-39ab-41ee-8df5-08896976666c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.776456 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 
09:47:10.777785 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.778274 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5968ec26-dea6-4e79-99b1-5954e173d226-encryption-config\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.778406 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.778809 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df659e7d-39ab-41ee-8df5-08896976666c-config\") pod \"machine-api-operator-5694c8668f-k627b\" (UID: \"df659e7d-39ab-41ee-8df5-08896976666c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.779052 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5198f9e2-ae27-4804-ab74-0759a5217d89-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jt5wx\" (UID: \"5198f9e2-ae27-4804-ab74-0759a5217d89\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.782458 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb18a52e-b1db-406b-a2e8-88a1ae8b05fc-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2kgbd\" (UID: \"bb18a52e-b1db-406b-a2e8-88a1ae8b05fc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.782704 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9324aa8b-fbce-42bb-b339-0aa2e382efd4-audit-dir\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.783719 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5968ec26-dea6-4e79-99b1-5954e173d226-node-pullsecrets\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.784161 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e9b2e26-976d-498c-88d8-dbddd520c9bf-audit-dir\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.785143 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.786164 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-etcd-serving-ca\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.786555 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9e9b2e26-976d-498c-88d8-dbddd520c9bf-etcd-client\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.786966 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.787361 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd468f98-7720-4f9a-972f-684b96f4f90f-serving-cert\") pod \"route-controller-manager-6576b87f9c-qltqp\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.798049 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e9b2e26-976d-498c-88d8-dbddd520c9bf-serving-cert\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.804358 4873 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-audit\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.805504 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3300ef2b-adb8-4aea-b8ef-cdec19d504b3-machine-approver-tls\") pod \"machine-approver-56656f9798-r7gp2\" (UID: \"3300ef2b-adb8-4aea-b8ef-cdec19d504b3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.805648 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9e9b2e26-976d-498c-88d8-dbddd520c9bf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.805656 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd468f98-7720-4f9a-972f-684b96f4f90f-client-ca\") pod \"route-controller-manager-6576b87f9c-qltqp\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.806016 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5968ec26-dea6-4e79-99b1-5954e173d226-serving-cert\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.806376 4873 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5198f9e2-ae27-4804-ab74-0759a5217d89-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jt5wx\" (UID: \"5198f9e2-ae27-4804-ab74-0759a5217d89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.806578 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/595c8db4-733e-4729-aa34-8be7307043a8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.806690 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.807069 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5968ec26-dea6-4e79-99b1-5954e173d226-etcd-client\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.807336 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/595c8db4-733e-4729-aa34-8be7307043a8-service-ca-bundle\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.807454 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2d87932-1993-464d-b3d2-71025526e1f2-trusted-ca\") pod \"console-operator-58897d9998-dxcz7\" (UID: \"b2d87932-1993-464d-b3d2-71025526e1f2\") " pod="openshift-console-operator/console-operator-58897d9998-dxcz7" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.808038 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e9b2e26-976d-498c-88d8-dbddd520c9bf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.808178 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.808498 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-config\") pod \"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.808895 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c1d3a6-23fd-4526-8892-0add23b09a9a-serving-cert\") pod 
\"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.809246 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-d75st"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.810663 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5968ec26-dea6-4e79-99b1-5954e173d226-image-import-ca\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.809393 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/df659e7d-39ab-41ee-8df5-08896976666c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-k627b\" (UID: \"df659e7d-39ab-41ee-8df5-08896976666c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.809458 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.809654 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb18a52e-b1db-406b-a2e8-88a1ae8b05fc-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2kgbd\" (UID: \"bb18a52e-b1db-406b-a2e8-88a1ae8b05fc\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.809720 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.809998 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/595c8db4-733e-4729-aa34-8be7307043a8-config\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.810215 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2d87932-1993-464d-b3d2-71025526e1f2-serving-cert\") pod \"console-operator-58897d9998-dxcz7\" (UID: \"b2d87932-1993-464d-b3d2-71025526e1f2\") " pod="openshift-console-operator/console-operator-58897d9998-dxcz7" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.810246 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9e9b2e26-976d-498c-88d8-dbddd520c9bf-encryption-config\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.810441 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/595c8db4-733e-4729-aa34-8be7307043a8-serving-cert\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.809327 4873 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p97g8" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.811682 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.811717 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.811730 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.811744 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-86hhq"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.811758 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.811819 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.812172 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.814561 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jsc24"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.814589 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qmrn5"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.815011 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vklwp"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.817615 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7hhjq"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.819023 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p97g8"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.819895 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.820034 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-d75st"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.820884 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mv87q"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.822248 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mv87q"] Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.822346 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mv87q" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.829733 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.850464 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862432 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-service-ca\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862471 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79bb3a49-346f-49b7-bb8e-c358105f8035-config\") pod \"kube-apiserver-operator-766d6c64bb-f24fn\" (UID: \"79bb3a49-346f-49b7-bb8e-c358105f8035\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862498 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldgn9\" (UniqueName: 
\"kubernetes.io/projected/ad63def1-18c4-4841-a936-b7c7e42ce092-kube-api-access-ldgn9\") pod \"service-ca-operator-777779d784-wfq9w\" (UID: \"ad63def1-18c4-4841-a936-b7c7e42ce092\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862520 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ggch\" (UniqueName: \"kubernetes.io/projected/10aa25f4-7549-468a-b42f-19305ad066dd-kube-api-access-9ggch\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862542 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f60efd0-54f5-43eb-b824-f8eaa836df60-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9hwg5\" (UID: \"6f60efd0-54f5-43eb-b824-f8eaa836df60\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862581 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79bb3a49-346f-49b7-bb8e-c358105f8035-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f24fn\" (UID: \"79bb3a49-346f-49b7-bb8e-c358105f8035\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862602 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/06e4a751-614f-49d2-8246-c76419d1ccb4-profile-collector-cert\") pod \"catalog-operator-68c6474976-dg6jw\" (UID: \"06e4a751-614f-49d2-8246-c76419d1ccb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw" Feb 19 09:47:10 
crc kubenswrapper[4873]: I0219 09:47:10.862620 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-trusted-ca-bundle\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862641 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12ef881d-885a-4215-bd57-27966cb209b8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pzspc\" (UID: \"12ef881d-885a-4215-bd57-27966cb209b8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862660 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b881e81-67ed-4c33-a992-da59d7996b9d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bxfwb\" (UID: \"4b881e81-67ed-4c33-a992-da59d7996b9d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862679 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj2jk\" (UniqueName: \"kubernetes.io/projected/de77b9aa-b558-4431-b116-5e1e1cc116f3-kube-api-access-nj2jk\") pod \"collect-profiles-29524905-jqdfw\" (UID: \"de77b9aa-b558-4431-b116-5e1e1cc116f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862700 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/34f3caca-1b4c-493d-a10b-277b42d7ce72-stats-auth\") pod 
\"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862910 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/12ef881d-885a-4215-bd57-27966cb209b8-images\") pod \"machine-config-operator-74547568cd-pzspc\" (UID: \"12ef881d-885a-4215-bd57-27966cb209b8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862927 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34f3caca-1b4c-493d-a10b-277b42d7ce72-metrics-certs\") pod \"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862947 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gczv\" (UniqueName: \"kubernetes.io/projected/e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e-kube-api-access-2gczv\") pod \"downloads-7954f5f757-9pq25\" (UID: \"e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e\") " pod="openshift-console/downloads-7954f5f757-9pq25" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862965 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4gld\" (UniqueName: \"kubernetes.io/projected/06e4a751-614f-49d2-8246-c76419d1ccb4-kube-api-access-j4gld\") pod \"catalog-operator-68c6474976-dg6jw\" (UID: \"06e4a751-614f-49d2-8246-c76419d1ccb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.862985 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/3c4f7134-312f-4f1d-a344-80d44d65c371-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gk5mg\" (UID: \"3c4f7134-312f-4f1d-a344-80d44d65c371\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863004 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad63def1-18c4-4841-a936-b7c7e42ce092-serving-cert\") pod \"service-ca-operator-777779d784-wfq9w\" (UID: \"ad63def1-18c4-4841-a936-b7c7e42ce092\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863030 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/34f3caca-1b4c-493d-a10b-277b42d7ce72-default-certificate\") pod \"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863049 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lqpt\" (UniqueName: \"kubernetes.io/projected/3c4f7134-312f-4f1d-a344-80d44d65c371-kube-api-access-5lqpt\") pod \"openshift-config-operator-7777fb866f-gk5mg\" (UID: \"3c4f7134-312f-4f1d-a344-80d44d65c371\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863077 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/10aa25f4-7549-468a-b42f-19305ad066dd-console-oauth-config\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863097 
4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwrbb\" (UniqueName: \"kubernetes.io/projected/57d54c43-611a-40f1-b05e-9a0007dbe3ec-kube-api-access-hwrbb\") pod \"migrator-59844c95c7-5tt6k\" (UID: \"57d54c43-611a-40f1-b05e-9a0007dbe3ec\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5tt6k" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863154 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d639ff25-343e-4e7c-bd2e-f5fc533923f4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-s67xb\" (UID: \"d639ff25-343e-4e7c-bd2e-f5fc533923f4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s67xb" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863176 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de77b9aa-b558-4431-b116-5e1e1cc116f3-config-volume\") pod \"collect-profiles-29524905-jqdfw\" (UID: \"de77b9aa-b558-4431-b116-5e1e1cc116f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863198 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fz4c\" (UniqueName: \"kubernetes.io/projected/d639ff25-343e-4e7c-bd2e-f5fc533923f4-kube-api-access-2fz4c\") pod \"control-plane-machine-set-operator-78cbb6b69f-s67xb\" (UID: \"d639ff25-343e-4e7c-bd2e-f5fc533923f4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s67xb" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863219 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/10aa25f4-7549-468a-b42f-19305ad066dd-console-serving-cert\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863240 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad63def1-18c4-4841-a936-b7c7e42ce092-config\") pod \"service-ca-operator-777779d784-wfq9w\" (UID: \"ad63def1-18c4-4841-a936-b7c7e42ce092\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863257 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34f3caca-1b4c-493d-a10b-277b42d7ce72-service-ca-bundle\") pod \"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863285 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzhg7\" (UniqueName: \"kubernetes.io/projected/5288b888-1b48-4590-8d10-f3688ba87a41-kube-api-access-rzhg7\") pod \"multus-admission-controller-857f4d67dd-lkp4m\" (UID: \"5288b888-1b48-4590-8d10-f3688ba87a41\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lkp4m" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863305 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c4f7134-312f-4f1d-a344-80d44d65c371-serving-cert\") pod \"openshift-config-operator-7777fb866f-gk5mg\" (UID: \"3c4f7134-312f-4f1d-a344-80d44d65c371\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863334 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fpwks\" (UniqueName: \"kubernetes.io/projected/12ef881d-885a-4215-bd57-27966cb209b8-kube-api-access-fpwks\") pod \"machine-config-operator-74547568cd-pzspc\" (UID: \"12ef881d-885a-4215-bd57-27966cb209b8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863354 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48911b55-fb42-412b-9298-4cba1105a164-config\") pod \"kube-controller-manager-operator-78b949d7b-hs5fr\" (UID: \"48911b55-fb42-412b-9298-4cba1105a164\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863372 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rhkq\" (UniqueName: \"kubernetes.io/projected/34f3caca-1b4c-493d-a10b-277b42d7ce72-kube-api-access-8rhkq\") pod \"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863394 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79bb3a49-346f-49b7-bb8e-c358105f8035-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f24fn\" (UID: \"79bb3a49-346f-49b7-bb8e-c358105f8035\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863426 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48911b55-fb42-412b-9298-4cba1105a164-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hs5fr\" (UID: 
\"48911b55-fb42-412b-9298-4cba1105a164\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863445 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5288b888-1b48-4590-8d10-f3688ba87a41-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lkp4m\" (UID: \"5288b888-1b48-4590-8d10-f3688ba87a41\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lkp4m" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863465 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de77b9aa-b558-4431-b116-5e1e1cc116f3-secret-volume\") pod \"collect-profiles-29524905-jqdfw\" (UID: \"de77b9aa-b558-4431-b116-5e1e1cc116f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863498 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgcd\" (UniqueName: \"kubernetes.io/projected/6f60efd0-54f5-43eb-b824-f8eaa836df60-kube-api-access-rkgcd\") pod \"cluster-samples-operator-665b6dd947-9hwg5\" (UID: \"6f60efd0-54f5-43eb-b824-f8eaa836df60\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863538 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-console-config\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863558 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/06e4a751-614f-49d2-8246-c76419d1ccb4-srv-cert\") pod \"catalog-operator-68c6474976-dg6jw\" (UID: \"06e4a751-614f-49d2-8246-c76419d1ccb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863586 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-oauth-serving-cert\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863615 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12ef881d-885a-4215-bd57-27966cb209b8-proxy-tls\") pod \"machine-config-operator-74547568cd-pzspc\" (UID: \"12ef881d-885a-4215-bd57-27966cb209b8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863635 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b881e81-67ed-4c33-a992-da59d7996b9d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bxfwb\" (UID: \"4b881e81-67ed-4c33-a992-da59d7996b9d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863659 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48911b55-fb42-412b-9298-4cba1105a164-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hs5fr\" (UID: \"48911b55-fb42-412b-9298-4cba1105a164\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr" Feb 19 
09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.863714 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff89g\" (UniqueName: \"kubernetes.io/projected/4b881e81-67ed-4c33-a992-da59d7996b9d-kube-api-access-ff89g\") pod \"kube-storage-version-migrator-operator-b67b599dd-bxfwb\" (UID: \"4b881e81-67ed-4c33-a992-da59d7996b9d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.864622 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-service-ca\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.865336 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79bb3a49-346f-49b7-bb8e-c358105f8035-config\") pod \"kube-apiserver-operator-766d6c64bb-f24fn\" (UID: \"79bb3a49-346f-49b7-bb8e-c358105f8035\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.868285 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f60efd0-54f5-43eb-b824-f8eaa836df60-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9hwg5\" (UID: \"6f60efd0-54f5-43eb-b824-f8eaa836df60\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.869376 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-trusted-ca-bundle\") pod \"console-f9d7485db-shnwj\" 
(UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.870486 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3c4f7134-312f-4f1d-a344-80d44d65c371-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gk5mg\" (UID: \"3c4f7134-312f-4f1d-a344-80d44d65c371\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.871139 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12ef881d-885a-4215-bd57-27966cb209b8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pzspc\" (UID: \"12ef881d-885a-4215-bd57-27966cb209b8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.871956 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48911b55-fb42-412b-9298-4cba1105a164-config\") pod \"kube-controller-manager-operator-78b949d7b-hs5fr\" (UID: \"48911b55-fb42-412b-9298-4cba1105a164\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.872028 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c4f7134-312f-4f1d-a344-80d44d65c371-serving-cert\") pod \"openshift-config-operator-7777fb866f-gk5mg\" (UID: \"3c4f7134-312f-4f1d-a344-80d44d65c371\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.872748 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-console-config\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.873461 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-oauth-serving-cert\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.873696 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/10aa25f4-7549-468a-b42f-19305ad066dd-console-serving-cert\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.875995 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48911b55-fb42-412b-9298-4cba1105a164-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hs5fr\" (UID: \"48911b55-fb42-412b-9298-4cba1105a164\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.876492 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79bb3a49-346f-49b7-bb8e-c358105f8035-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f24fn\" (UID: \"79bb3a49-346f-49b7-bb8e-c358105f8035\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.876870 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/10aa25f4-7549-468a-b42f-19305ad066dd-console-oauth-config\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.882653 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.889271 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.895919 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d639ff25-343e-4e7c-bd2e-f5fc533923f4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-s67xb\" (UID: \"d639ff25-343e-4e7c-bd2e-f5fc533923f4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s67xb" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.909756 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.929872 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.936221 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b881e81-67ed-4c33-a992-da59d7996b9d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bxfwb\" (UID: \"4b881e81-67ed-4c33-a992-da59d7996b9d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb" Feb 19 
09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.950483 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.969612 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.972449 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b881e81-67ed-4c33-a992-da59d7996b9d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bxfwb\" (UID: \"4b881e81-67ed-4c33-a992-da59d7996b9d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb" Feb 19 09:47:10 crc kubenswrapper[4873]: I0219 09:47:10.989689 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.010380 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.030391 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.050328 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.070686 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.077657 4873 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/06e4a751-614f-49d2-8246-c76419d1ccb4-srv-cert\") pod \"catalog-operator-68c6474976-dg6jw\" (UID: \"06e4a751-614f-49d2-8246-c76419d1ccb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.090893 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.094846 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de77b9aa-b558-4431-b116-5e1e1cc116f3-secret-volume\") pod \"collect-profiles-29524905-jqdfw\" (UID: \"de77b9aa-b558-4431-b116-5e1e1cc116f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.102844 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/06e4a751-614f-49d2-8246-c76419d1ccb4-profile-collector-cert\") pod \"catalog-operator-68c6474976-dg6jw\" (UID: \"06e4a751-614f-49d2-8246-c76419d1ccb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.109983 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.130713 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.137337 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5288b888-1b48-4590-8d10-f3688ba87a41-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-lkp4m\" (UID: \"5288b888-1b48-4590-8d10-f3688ba87a41\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lkp4m" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.150163 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.170156 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.189956 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.210299 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.219191 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad63def1-18c4-4841-a936-b7c7e42ce092-serving-cert\") pod \"service-ca-operator-777779d784-wfq9w\" (UID: \"ad63def1-18c4-4841-a936-b7c7e42ce092\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.230555 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.240737 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad63def1-18c4-4841-a936-b7c7e42ce092-config\") pod \"service-ca-operator-777779d784-wfq9w\" (UID: \"ad63def1-18c4-4841-a936-b7c7e42ce092\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.251493 4873 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.271396 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.290540 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.296416 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/34f3caca-1b4c-493d-a10b-277b42d7ce72-stats-auth\") pod \"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.310422 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.315965 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34f3caca-1b4c-493d-a10b-277b42d7ce72-metrics-certs\") pod \"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.331347 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.350604 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.359321 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/34f3caca-1b4c-493d-a10b-277b42d7ce72-default-certificate\") pod \"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.370823 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.390169 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.401618 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34f3caca-1b4c-493d-a10b-277b42d7ce72-service-ca-bundle\") pod \"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.410758 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.415391 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de77b9aa-b558-4431-b116-5e1e1cc116f3-config-volume\") pod \"collect-profiles-29524905-jqdfw\" (UID: \"de77b9aa-b558-4431-b116-5e1e1cc116f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.430741 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.450677 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" 
Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.452282 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/12ef881d-885a-4215-bd57-27966cb209b8-images\") pod \"machine-config-operator-74547568cd-pzspc\" (UID: \"12ef881d-885a-4215-bd57-27966cb209b8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.471380 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.490694 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.498735 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12ef881d-885a-4215-bd57-27966cb209b8-proxy-tls\") pod \"machine-config-operator-74547568cd-pzspc\" (UID: \"12ef881d-885a-4215-bd57-27966cb209b8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.522532 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.530819 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.551193 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.570093 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 
09:47:11.590502 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.610838 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.630909 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.651851 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.671805 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.691529 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.710325 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.729073 4873 request.go:700] Waited for 1.015780616s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns-operator/secrets?fieldSelector=metadata.name%3Dmetrics-tls&limit=500&resourceVersion=0 Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.731745 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.750834 4873 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.770255 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.802412 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.810901 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.830794 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.850771 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.871362 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.892734 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.911313 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.931553 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.950911 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.970783 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 19 09:47:11 crc kubenswrapper[4873]: I0219 09:47:11.989931 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.012066 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.031823 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.050475 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.070473 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.104552 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.110725 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.131002 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.150840 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.171728 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.191091 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.211010 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.231834 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.251399 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.271571 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.291688 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.310991 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.331271 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.351562 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.371781 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.391475 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.439988 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpxsx\" (UniqueName: \"kubernetes.io/projected/9324aa8b-fbce-42bb-b339-0aa2e382efd4-kube-api-access-fpxsx\") pod \"oauth-openshift-558db77b4-4g545\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g545"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.458545 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltjkk\" (UniqueName: \"kubernetes.io/projected/3300ef2b-adb8-4aea-b8ef-cdec19d504b3-kube-api-access-ltjkk\") pod \"machine-approver-56656f9798-r7gp2\" (UID: \"3300ef2b-adb8-4aea-b8ef-cdec19d504b3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.483407 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbm8s\" (UniqueName: \"kubernetes.io/projected/df659e7d-39ab-41ee-8df5-08896976666c-kube-api-access-tbm8s\") pod \"machine-api-operator-5694c8668f-k627b\" (UID: \"df659e7d-39ab-41ee-8df5-08896976666c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.500875 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd4sz\" (UniqueName: \"kubernetes.io/projected/5198f9e2-ae27-4804-ab74-0759a5217d89-kube-api-access-gd4sz\") pod \"openshift-apiserver-operator-796bbdcf4f-jt5wx\" (UID: \"5198f9e2-ae27-4804-ab74-0759a5217d89\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.509429 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.518840 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmb7k\" (UniqueName: \"kubernetes.io/projected/c8c1d3a6-23fd-4526-8892-0add23b09a9a-kube-api-access-mmb7k\") pod \"controller-manager-879f6c89f-qvxgz\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.528295 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll4wz\" (UniqueName: \"kubernetes.io/projected/bd468f98-7720-4f9a-972f-684b96f4f90f-kube-api-access-ll4wz\") pod \"route-controller-manager-6576b87f9c-qltqp\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.529426 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.536526 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4g545"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.547160 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l9hc\" (UniqueName: \"kubernetes.io/projected/9e9b2e26-976d-498c-88d8-dbddd520c9bf-kube-api-access-5l9hc\") pod \"apiserver-7bbb656c7d-7bgm9\" (UID: \"9e9b2e26-976d-498c-88d8-dbddd520c9bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.558345 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.565913 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jkhf\" (UniqueName: \"kubernetes.io/projected/595c8db4-733e-4729-aa34-8be7307043a8-kube-api-access-6jkhf\") pod \"authentication-operator-69f744f599-jsc24\" (UID: \"595c8db4-733e-4729-aa34-8be7307043a8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.587858 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfvdj\" (UniqueName: \"kubernetes.io/projected/bb18a52e-b1db-406b-a2e8-88a1ae8b05fc-kube-api-access-nfvdj\") pod \"openshift-controller-manager-operator-756b6f6bc6-2kgbd\" (UID: \"bb18a52e-b1db-406b-a2e8-88a1ae8b05fc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.606973 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.607012 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csvqh\" (UniqueName: \"kubernetes.io/projected/b2d87932-1993-464d-b3d2-71025526e1f2-kube-api-access-csvqh\") pod \"console-operator-58897d9998-dxcz7\" (UID: \"b2d87932-1993-464d-b3d2-71025526e1f2\") " pod="openshift-console-operator/console-operator-58897d9998-dxcz7"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.629544 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.636332 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r54wb\" (UniqueName: \"kubernetes.io/projected/5968ec26-dea6-4e79-99b1-5954e173d226-kube-api-access-r54wb\") pod \"apiserver-76f77b778f-gbzll\" (UID: \"5968ec26-dea6-4e79-99b1-5954e173d226\") " pod="openshift-apiserver/apiserver-76f77b778f-gbzll"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.650983 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.670786 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.690381 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.702269 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.710319 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.730478 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.748399 4873 request.go:700] Waited for 1.936356801s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/secrets?fieldSelector=metadata.name%3Dcsi-hostpath-provisioner-sa-dockercfg-qd74k&limit=500&resourceVersion=0
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.751130 4873 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.751362 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.766510 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-dxcz7"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.770675 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.798794 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.811413 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.833238 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.851061 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.889847 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff89g\" (UniqueName: \"kubernetes.io/projected/4b881e81-67ed-4c33-a992-da59d7996b9d-kube-api-access-ff89g\") pod \"kube-storage-version-migrator-operator-b67b599dd-bxfwb\" (UID: \"4b881e81-67ed-4c33-a992-da59d7996b9d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.912767 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldgn9\" (UniqueName: \"kubernetes.io/projected/ad63def1-18c4-4841-a936-b7c7e42ce092-kube-api-access-ldgn9\") pod \"service-ca-operator-777779d784-wfq9w\" (UID: \"ad63def1-18c4-4841-a936-b7c7e42ce092\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.924170 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gbzll"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.930386 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ggch\" (UniqueName: \"kubernetes.io/projected/10aa25f4-7549-468a-b42f-19305ad066dd-kube-api-access-9ggch\") pod \"console-f9d7485db-shnwj\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " pod="openshift-console/console-f9d7485db-shnwj"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.943667 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-shnwj"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.950666 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79bb3a49-346f-49b7-bb8e-c358105f8035-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f24fn\" (UID: \"79bb3a49-346f-49b7-bb8e-c358105f8035\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.964015 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzhg7\" (UniqueName: \"kubernetes.io/projected/5288b888-1b48-4590-8d10-f3688ba87a41-kube-api-access-rzhg7\") pod \"multus-admission-controller-857f4d67dd-lkp4m\" (UID: \"5288b888-1b48-4590-8d10-f3688ba87a41\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lkp4m"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.970241 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd"]
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.981398 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.994976 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj2jk\" (UniqueName: \"kubernetes.io/projected/de77b9aa-b558-4431-b116-5e1e1cc116f3-kube-api-access-nj2jk\") pod \"collect-profiles-29524905-jqdfw\" (UID: \"de77b9aa-b558-4431-b116-5e1e1cc116f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw"
Feb 19 09:47:12 crc kubenswrapper[4873]: I0219 09:47:12.996259 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qvxgz"]
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.002288 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4g545"]
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.005882 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-k627b"]
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.007539 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gczv\" (UniqueName: \"kubernetes.io/projected/e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e-kube-api-access-2gczv\") pod \"downloads-7954f5f757-9pq25\" (UID: \"e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e\") " pod="openshift-console/downloads-7954f5f757-9pq25"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.011490 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb"
Feb 19 09:47:13 crc kubenswrapper[4873]: W0219 09:47:13.018263 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8c1d3a6_23fd_4526_8892_0add23b09a9a.slice/crio-3919526da5da79321b05444b65501cd491975ca30007c3620a85b734545d5c95 WatchSource:0}: Error finding container 3919526da5da79321b05444b65501cd491975ca30007c3620a85b734545d5c95: Status 404 returned error can't find the container with id 3919526da5da79321b05444b65501cd491975ca30007c3620a85b734545d5c95
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.024720 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lkp4m"
Feb 19 09:47:13 crc kubenswrapper[4873]: W0219 09:47:13.025028 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb18a52e_b1db_406b_a2e8_88a1ae8b05fc.slice/crio-9f71a4f3f66d0666fb1b17346581f9f693cb4bf9056801187b3c9e9a209ec54e WatchSource:0}: Error finding container 9f71a4f3f66d0666fb1b17346581f9f693cb4bf9056801187b3c9e9a209ec54e: Status 404 returned error can't find the container with id 9f71a4f3f66d0666fb1b17346581f9f693cb4bf9056801187b3c9e9a209ec54e
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.026140 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4gld\" (UniqueName: \"kubernetes.io/projected/06e4a751-614f-49d2-8246-c76419d1ccb4-kube-api-access-j4gld\") pod \"catalog-operator-68c6474976-dg6jw\" (UID: \"06e4a751-614f-49d2-8246-c76419d1ccb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.031859 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.047392 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.050746 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpwks\" (UniqueName: \"kubernetes.io/projected/12ef881d-885a-4215-bd57-27966cb209b8-kube-api-access-fpwks\") pod \"machine-config-operator-74547568cd-pzspc\" (UID: \"12ef881d-885a-4215-bd57-27966cb209b8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.055117 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.082931 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rhkq\" (UniqueName: \"kubernetes.io/projected/34f3caca-1b4c-493d-a10b-277b42d7ce72-kube-api-access-8rhkq\") pod \"router-default-5444994796-kzpbf\" (UID: \"34f3caca-1b4c-493d-a10b-277b42d7ce72\") " pod="openshift-ingress/router-default-5444994796-kzpbf"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.090890 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgcd\" (UniqueName: \"kubernetes.io/projected/6f60efd0-54f5-43eb-b824-f8eaa836df60-kube-api-access-rkgcd\") pod \"cluster-samples-operator-665b6dd947-9hwg5\" (UID: \"6f60efd0-54f5-43eb-b824-f8eaa836df60\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.095410 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9"]
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.107528 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48911b55-fb42-412b-9298-4cba1105a164-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hs5fr\" (UID: \"48911b55-fb42-412b-9298-4cba1105a164\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.108802 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jsc24"]
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.109904 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx"]
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.127281 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwrbb\" (UniqueName: \"kubernetes.io/projected/57d54c43-611a-40f1-b05e-9a0007dbe3ec-kube-api-access-hwrbb\") pod \"migrator-59844c95c7-5tt6k\" (UID: \"57d54c43-611a-40f1-b05e-9a0007dbe3ec\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5tt6k"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.153461 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lqpt\" (UniqueName: \"kubernetes.io/projected/3c4f7134-312f-4f1d-a344-80d44d65c371-kube-api-access-5lqpt\") pod \"openshift-config-operator-7777fb866f-gk5mg\" (UID: \"3c4f7134-312f-4f1d-a344-80d44d65c371\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.163438 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gbzll"]
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.173541 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fz4c\" (UniqueName: \"kubernetes.io/projected/d639ff25-343e-4e7c-bd2e-f5fc533923f4-kube-api-access-2fz4c\") pod \"control-plane-machine-set-operator-78cbb6b69f-s67xb\" (UID: \"d639ff25-343e-4e7c-bd2e-f5fc533923f4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s67xb"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.206951 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f905b5ea-71df-4b1c-997c-d68766bcfcfe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-86hhq\" (UID: \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\") " pod="openshift-marketplace/marketplace-operator-79b997595-86hhq"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.207016 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40382b72-88a7-4f37-9192-a555a259d4bd-serving-cert\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.207038 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b283da7-d736-4ac2-a290-e142728e838a-srv-cert\") pod \"olm-operator-6b444d44fb-2b5f5\" (UID: \"5b283da7-d736-4ac2-a290-e142728e838a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.207089 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21aad9a0-00de-4f42-9923-6c66c79a3a8d-proxy-tls\") pod \"machine-config-controller-84d6567774-24gcv\" (UID: \"21aad9a0-00de-4f42-9923-6c66c79a3a8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.207430 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.207473 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4fmb\" (UniqueName: \"kubernetes.io/projected/7a3637cc-cfef-446c-b0fb-f37f3396e0d7-kube-api-access-b4fmb\") pod \"machine-config-server-2798g\" (UID: \"7a3637cc-cfef-446c-b0fb-f37f3396e0d7\") " pod="openshift-machine-config-operator/machine-config-server-2798g"
Feb 19 09:47:13 crc kubenswrapper[4873]: E0219 09:47:13.207931 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:13.707917592 +0000 UTC m=+142.997349230 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:47:13 crc kubenswrapper[4873]: W0219 09:47:13.208210 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5968ec26_dea6_4e79_99b1_5954e173d226.slice/crio-8af950b8e44dbf5989eeba42ba1cc61d77c347933b88af6cc094181042dc3724 WatchSource:0}: Error finding container 8af950b8e44dbf5989eeba42ba1cc61d77c347933b88af6cc094181042dc3724: Status 404 returned error can't find the container with id 8af950b8e44dbf5989eeba42ba1cc61d77c347933b88af6cc094181042dc3724
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.208338 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5920bdb-afd9-401e-8f11-108a90660e1c-metrics-tls\") pod \"ingress-operator-5b745b69d9-mwl9k\" (UID: \"a5920bdb-afd9-401e-8f11-108a90660e1c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.208430 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c6tw\" (UniqueName: \"kubernetes.io/projected/7671d99c-f025-4e36-b336-106655ec13ef-kube-api-access-5c6tw\") pod \"dns-operator-744455d44c-qmrn5\" (UID: \"7671d99c-f025-4e36-b336-106655ec13ef\") " pod="openshift-dns-operator/dns-operator-744455d44c-qmrn5"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.208850 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2948a5a7-4d94-4314-acdf-489dd93609b9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.208883 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5920bdb-afd9-401e-8f11-108a90660e1c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mwl9k\" (UID: \"a5920bdb-afd9-401e-8f11-108a90660e1c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.208912 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21aad9a0-00de-4f42-9923-6c66c79a3a8d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-24gcv\" (UID: \"21aad9a0-00de-4f42-9923-6c66c79a3a8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.208944 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/40382b72-88a7-4f37-9192-a555a259d4bd-etcd-client\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.209183 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpfg6\" (UniqueName: \"kubernetes.io/projected/829eb540-5f77-4748-a99d-c5bdbd13c26f-kube-api-access-gpfg6\") pod \"service-ca-9c57cc56f-tjxkj\" (UID: \"829eb540-5f77-4748-a99d-c5bdbd13c26f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tjxkj"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.209212 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-shnwj"]
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.209241 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5e97ddb-b404-4ce2-b760-2739c36c755a-tmpfs\") pod \"packageserver-d55dfcdfc-rpsnj\" (UID: \"c5e97ddb-b404-4ce2-b760-2739c36c755a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.209262 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e3dce33-cc6d-41b5-ac17-481a98c06373-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8cxf7\" (UID: \"2e3dce33-cc6d-41b5-ac17-481a98c06373\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.209301 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f69ad03d-7d61-4b31-a556-325751fcba8e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kx6gv\" (UID: \"f69ad03d-7d61-4b31-a556-325751fcba8e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.209437 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f69ad03d-7d61-4b31-a556-325751fcba8e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kx6gv\" (UID: \"f69ad03d-7d61-4b31-a556-325751fcba8e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.209770 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/829eb540-5f77-4748-a99d-c5bdbd13c26f-signing-key\") pod \"service-ca-9c57cc56f-tjxkj\" (UID: \"829eb540-5f77-4748-a99d-c5bdbd13c26f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tjxkj"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.210397 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-registry-tls\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.210428 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40382b72-88a7-4f37-9192-a555a259d4bd-config\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.210535 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf2de5cd-4280-4c0c-9276-b693a51986b7-cert\") pod \"ingress-canary-p97g8\" (UID: \"bf2de5cd-4280-4c0c-9276-b693a51986b7\") " pod="openshift-ingress-canary/ingress-canary-p97g8"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.210597 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bf2ad48-6696-4f08-adc8-330fd4c25028-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jjkrt\" (UID: \"4bf2ad48-6696-4f08-adc8-330fd4c25028\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.210896 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f69ad03d-7d61-4b31-a556-325751fcba8e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kx6gv\" (UID: \"f69ad03d-7d61-4b31-a556-325751fcba8e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.211186 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2948a5a7-4d94-4314-acdf-489dd93609b9-registry-certificates\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.211425 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5e97ddb-b404-4ce2-b760-2739c36c755a-webhook-cert\") pod \"packageserver-d55dfcdfc-rpsnj\" (UID: \"c5e97ddb-b404-4ce2-b760-2739c36c755a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.211453 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/40382b72-88a7-4f37-9192-a555a259d4bd-etcd-service-ca\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.211532 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slpss\" (UniqueName: \"kubernetes.io/projected/40382b72-88a7-4f37-9192-a555a259d4bd-kube-api-access-slpss\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.211821 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/829eb540-5f77-4748-a99d-c5bdbd13c26f-signing-cabundle\") pod \"service-ca-9c57cc56f-tjxkj\" (UID: \"829eb540-5f77-4748-a99d-c5bdbd13c26f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tjxkj"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.211850 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdbm8\" (UniqueName: \"kubernetes.io/projected/21aad9a0-00de-4f42-9923-6c66c79a3a8d-kube-api-access-xdbm8\") pod \"machine-config-controller-84d6567774-24gcv\" (UID: \"21aad9a0-00de-4f42-9923-6c66c79a3a8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.211911 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7wm6\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-kube-api-access-q7wm6\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq"
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.212576 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5b283da7-d736-4ac2-a290-e142728e838a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2b5f5\" (UID: \"5b283da7-d736-4ac2-a290-e142728e838a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.212598 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2948a5a7-4d94-4314-acdf-489dd93609b9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.212874 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f905b5ea-71df-4b1c-997c-d68766bcfcfe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-86hhq\" (UID: \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\") " pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.212895 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t66n7\" (UniqueName: \"kubernetes.io/projected/f905b5ea-71df-4b1c-997c-d68766bcfcfe-kube-api-access-t66n7\") pod \"marketplace-operator-79b997595-86hhq\" (UID: \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\") " pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.213166 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5e97ddb-b404-4ce2-b760-2739c36c755a-apiservice-cert\") pod \"packageserver-d55dfcdfc-rpsnj\" (UID: \"c5e97ddb-b404-4ce2-b760-2739c36c755a\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.213340 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sstl4\" (UniqueName: \"kubernetes.io/projected/bf2de5cd-4280-4c0c-9276-b693a51986b7-kube-api-access-sstl4\") pod \"ingress-canary-p97g8\" (UID: \"bf2de5cd-4280-4c0c-9276-b693a51986b7\") " pod="openshift-ingress-canary/ingress-canary-p97g8" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.213369 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8h4w\" (UniqueName: \"kubernetes.io/projected/4bf2ad48-6696-4f08-adc8-330fd4c25028-kube-api-access-c8h4w\") pod \"package-server-manager-789f6589d5-jjkrt\" (UID: \"4bf2ad48-6696-4f08-adc8-330fd4c25028\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.213645 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7a3637cc-cfef-446c-b0fb-f37f3396e0d7-certs\") pod \"machine-config-server-2798g\" (UID: \"7a3637cc-cfef-446c-b0fb-f37f3396e0d7\") " pod="openshift-machine-config-operator/machine-config-server-2798g" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.213770 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e3dce33-cc6d-41b5-ac17-481a98c06373-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8cxf7\" (UID: \"2e3dce33-cc6d-41b5-ac17-481a98c06373\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.214002 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2948a5a7-4d94-4314-acdf-489dd93609b9-trusted-ca\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.214090 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-bound-sa-token\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.214268 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/40382b72-88a7-4f37-9192-a555a259d4bd-etcd-ca\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.214348 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5920bdb-afd9-401e-8f11-108a90660e1c-trusted-ca\") pod \"ingress-operator-5b745b69d9-mwl9k\" (UID: \"a5920bdb-afd9-401e-8f11-108a90660e1c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.214792 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm7zn\" (UniqueName: \"kubernetes.io/projected/c5e97ddb-b404-4ce2-b760-2739c36c755a-kube-api-access-tm7zn\") pod \"packageserver-d55dfcdfc-rpsnj\" (UID: \"c5e97ddb-b404-4ce2-b760-2739c36c755a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" Feb 19 09:47:13 crc 
kubenswrapper[4873]: I0219 09:47:13.215003 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4xtz\" (UniqueName: \"kubernetes.io/projected/a5920bdb-afd9-401e-8f11-108a90660e1c-kube-api-access-p4xtz\") pod \"ingress-operator-5b745b69d9-mwl9k\" (UID: \"a5920bdb-afd9-401e-8f11-108a90660e1c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.215035 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7a3637cc-cfef-446c-b0fb-f37f3396e0d7-node-bootstrap-token\") pod \"machine-config-server-2798g\" (UID: \"7a3637cc-cfef-446c-b0fb-f37f3396e0d7\") " pod="openshift-machine-config-operator/machine-config-server-2798g" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.215192 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8trrg\" (UniqueName: \"kubernetes.io/projected/2e3dce33-cc6d-41b5-ac17-481a98c06373-kube-api-access-8trrg\") pod \"cluster-image-registry-operator-dc59b4c8b-8cxf7\" (UID: \"2e3dce33-cc6d-41b5-ac17-481a98c06373\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.215237 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-478rd\" (UniqueName: \"kubernetes.io/projected/5b283da7-d736-4ac2-a290-e142728e838a-kube-api-access-478rd\") pod \"olm-operator-6b444d44fb-2b5f5\" (UID: \"5b283da7-d736-4ac2-a290-e142728e838a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.215302 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/7671d99c-f025-4e36-b336-106655ec13ef-metrics-tls\") pod \"dns-operator-744455d44c-qmrn5\" (UID: \"7671d99c-f025-4e36-b336-106655ec13ef\") " pod="openshift-dns-operator/dns-operator-744455d44c-qmrn5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.216310 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e3dce33-cc6d-41b5-ac17-481a98c06373-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8cxf7\" (UID: \"2e3dce33-cc6d-41b5-ac17-481a98c06373\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.237527 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.242089 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd" event={"ID":"bb18a52e-b1db-406b-a2e8-88a1ae8b05fc","Type":"ContainerStarted","Data":"9f71a4f3f66d0666fb1b17346581f9f693cb4bf9056801187b3c9e9a209ec54e"} Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.248154 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp"] Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.249168 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4g545" event={"ID":"9324aa8b-fbce-42bb-b339-0aa2e382efd4","Type":"ContainerStarted","Data":"b4cafb3addf61abe3b1441fa50a8321f11c79cf993ea43c1a09c9c8ca90fbdfc"} Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.253426 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b" event={"ID":"df659e7d-39ab-41ee-8df5-08896976666c","Type":"ContainerStarted","Data":"996479731ccd4ae0ac95d94c3bc6866a40d655d2e667a16ece349b1dbabe65e8"} Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.253516 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.256472 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dxcz7"] Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.260290 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.263585 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2" event={"ID":"3300ef2b-adb8-4aea-b8ef-cdec19d504b3","Type":"ContainerStarted","Data":"a238776f2ad55cef9e1e6259d58b9ebc7c330639905481a3404c60dd5028fe51"} Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.263620 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2" event={"ID":"3300ef2b-adb8-4aea-b8ef-cdec19d504b3","Type":"ContainerStarted","Data":"ce636371d8d8e9e398cd742b48ffdca9fd56e2e90cf89e766fb4420d1d841433"} Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.263630 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2" event={"ID":"3300ef2b-adb8-4aea-b8ef-cdec19d504b3","Type":"ContainerStarted","Data":"751e435e40fd0f90bb69689c922fbfeccee4e0004e8903532e7dc95dab50d8db"} Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.275367 4873 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" event={"ID":"c8c1d3a6-23fd-4526-8892-0add23b09a9a","Type":"ContainerStarted","Data":"3919526da5da79321b05444b65501cd491975ca30007c3620a85b734545d5c95"} Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.282183 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gbzll" event={"ID":"5968ec26-dea6-4e79-99b1-5954e173d226","Type":"ContainerStarted","Data":"8af950b8e44dbf5989eeba42ba1cc61d77c347933b88af6cc094181042dc3724"} Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.283502 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" event={"ID":"9e9b2e26-976d-498c-88d8-dbddd520c9bf","Type":"ContainerStarted","Data":"a151c797426fa44538a258538af785a475d0df407621ec0f555080c653e8112f"} Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.288122 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24" event={"ID":"595c8db4-733e-4729-aa34-8be7307043a8","Type":"ContainerStarted","Data":"d9f1af9debc32423a5b203643faad3d9624bbdf1a35fb0469cf8556ee6523f59"} Feb 19 09:47:13 crc kubenswrapper[4873]: W0219 09:47:13.292560 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2d87932_1993_464d_b3d2_71025526e1f2.slice/crio-52c8c7565f7e76e2b1be9ed7ec38817313b26529cc4216d67909c7d5b38aadf8 WatchSource:0}: Error finding container 52c8c7565f7e76e2b1be9ed7ec38817313b26529cc4216d67909c7d5b38aadf8: Status 404 returned error can't find the container with id 52c8c7565f7e76e2b1be9ed7ec38817313b26529cc4216d67909c7d5b38aadf8 Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.293051 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-9pq25" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.298289 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5tt6k" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.304734 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s67xb" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.318375 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.318546 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.318697 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf2de5cd-4280-4c0c-9276-b693a51986b7-cert\") pod \"ingress-canary-p97g8\" (UID: \"bf2de5cd-4280-4c0c-9276-b693a51986b7\") " pod="openshift-ingress-canary/ingress-canary-p97g8" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.318727 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bf2ad48-6696-4f08-adc8-330fd4c25028-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jjkrt\" (UID: \"4bf2ad48-6696-4f08-adc8-330fd4c25028\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt" Feb 19 09:47:13 crc 
kubenswrapper[4873]: I0219 09:47:13.318753 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f69ad03d-7d61-4b31-a556-325751fcba8e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kx6gv\" (UID: \"f69ad03d-7d61-4b31-a556-325751fcba8e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.318801 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-plugins-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.318825 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2948a5a7-4d94-4314-acdf-489dd93609b9-registry-certificates\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.318843 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5e97ddb-b404-4ce2-b760-2739c36c755a-webhook-cert\") pod \"packageserver-d55dfcdfc-rpsnj\" (UID: \"c5e97ddb-b404-4ce2-b760-2739c36c755a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.318862 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/40382b72-88a7-4f37-9192-a555a259d4bd-etcd-service-ca\") pod \"etcd-operator-b45778765-vklwp\" (UID: 
\"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.318885 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slpss\" (UniqueName: \"kubernetes.io/projected/40382b72-88a7-4f37-9192-a555a259d4bd-kube-api-access-slpss\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.318903 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/829eb540-5f77-4748-a99d-c5bdbd13c26f-signing-cabundle\") pod \"service-ca-9c57cc56f-tjxkj\" (UID: \"829eb540-5f77-4748-a99d-c5bdbd13c26f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tjxkj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.318926 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdbm8\" (UniqueName: \"kubernetes.io/projected/21aad9a0-00de-4f42-9923-6c66c79a3a8d-kube-api-access-xdbm8\") pod \"machine-config-controller-84d6567774-24gcv\" (UID: \"21aad9a0-00de-4f42-9923-6c66c79a3a8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.318947 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7wm6\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-kube-api-access-q7wm6\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.318966 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/5b283da7-d736-4ac2-a290-e142728e838a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2b5f5\" (UID: \"5b283da7-d736-4ac2-a290-e142728e838a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.318984 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2948a5a7-4d94-4314-acdf-489dd93609b9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319002 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-mountpoint-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319018 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f905b5ea-71df-4b1c-997c-d68766bcfcfe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-86hhq\" (UID: \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\") " pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319033 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/176cb3ad-1201-420f-bdb2-586f974aeaf2-config-volume\") pod \"dns-default-mv87q\" (UID: \"176cb3ad-1201-420f-bdb2-586f974aeaf2\") " pod="openshift-dns/dns-default-mv87q" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319049 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-t66n7\" (UniqueName: \"kubernetes.io/projected/f905b5ea-71df-4b1c-997c-d68766bcfcfe-kube-api-access-t66n7\") pod \"marketplace-operator-79b997595-86hhq\" (UID: \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\") " pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319064 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5e97ddb-b404-4ce2-b760-2739c36c755a-apiservice-cert\") pod \"packageserver-d55dfcdfc-rpsnj\" (UID: \"c5e97ddb-b404-4ce2-b760-2739c36c755a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319080 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sstl4\" (UniqueName: \"kubernetes.io/projected/bf2de5cd-4280-4c0c-9276-b693a51986b7-kube-api-access-sstl4\") pod \"ingress-canary-p97g8\" (UID: \"bf2de5cd-4280-4c0c-9276-b693a51986b7\") " pod="openshift-ingress-canary/ingress-canary-p97g8" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319095 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8h4w\" (UniqueName: \"kubernetes.io/projected/4bf2ad48-6696-4f08-adc8-330fd4c25028-kube-api-access-c8h4w\") pod \"package-server-manager-789f6589d5-jjkrt\" (UID: \"4bf2ad48-6696-4f08-adc8-330fd4c25028\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319154 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7a3637cc-cfef-446c-b0fb-f37f3396e0d7-certs\") pod \"machine-config-server-2798g\" (UID: \"7a3637cc-cfef-446c-b0fb-f37f3396e0d7\") " pod="openshift-machine-config-operator/machine-config-server-2798g" 
Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319175 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e3dce33-cc6d-41b5-ac17-481a98c06373-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8cxf7\" (UID: \"2e3dce33-cc6d-41b5-ac17-481a98c06373\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319197 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2948a5a7-4d94-4314-acdf-489dd93609b9-trusted-ca\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319217 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-bound-sa-token\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319239 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/40382b72-88a7-4f37-9192-a555a259d4bd-etcd-ca\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319262 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5920bdb-afd9-401e-8f11-108a90660e1c-trusted-ca\") pod \"ingress-operator-5b745b69d9-mwl9k\" (UID: \"a5920bdb-afd9-401e-8f11-108a90660e1c\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319284 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm7zn\" (UniqueName: \"kubernetes.io/projected/c5e97ddb-b404-4ce2-b760-2739c36c755a-kube-api-access-tm7zn\") pod \"packageserver-d55dfcdfc-rpsnj\" (UID: \"c5e97ddb-b404-4ce2-b760-2739c36c755a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319307 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4xtz\" (UniqueName: \"kubernetes.io/projected/a5920bdb-afd9-401e-8f11-108a90660e1c-kube-api-access-p4xtz\") pod \"ingress-operator-5b745b69d9-mwl9k\" (UID: \"a5920bdb-afd9-401e-8f11-108a90660e1c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319325 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7a3637cc-cfef-446c-b0fb-f37f3396e0d7-node-bootstrap-token\") pod \"machine-config-server-2798g\" (UID: \"7a3637cc-cfef-446c-b0fb-f37f3396e0d7\") " pod="openshift-machine-config-operator/machine-config-server-2798g" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319348 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8trrg\" (UniqueName: \"kubernetes.io/projected/2e3dce33-cc6d-41b5-ac17-481a98c06373-kube-api-access-8trrg\") pod \"cluster-image-registry-operator-dc59b4c8b-8cxf7\" (UID: \"2e3dce33-cc6d-41b5-ac17-481a98c06373\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319371 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-478rd\" (UniqueName: 
\"kubernetes.io/projected/5b283da7-d736-4ac2-a290-e142728e838a-kube-api-access-478rd\") pod \"olm-operator-6b444d44fb-2b5f5\" (UID: \"5b283da7-d736-4ac2-a290-e142728e838a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319393 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7671d99c-f025-4e36-b336-106655ec13ef-metrics-tls\") pod \"dns-operator-744455d44c-qmrn5\" (UID: \"7671d99c-f025-4e36-b336-106655ec13ef\") " pod="openshift-dns-operator/dns-operator-744455d44c-qmrn5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319420 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-registration-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.319452 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-csi-data-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.320177 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e3dce33-cc6d-41b5-ac17-481a98c06373-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8cxf7\" (UID: \"2e3dce33-cc6d-41b5-ac17-481a98c06373\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.320235 4873 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f905b5ea-71df-4b1c-997c-d68766bcfcfe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-86hhq\" (UID: \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\") " pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.320728 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40382b72-88a7-4f37-9192-a555a259d4bd-serving-cert\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.320761 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b283da7-d736-4ac2-a290-e142728e838a-srv-cert\") pod \"olm-operator-6b444d44fb-2b5f5\" (UID: \"5b283da7-d736-4ac2-a290-e142728e838a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.320812 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21aad9a0-00de-4f42-9923-6c66c79a3a8d-proxy-tls\") pod \"machine-config-controller-84d6567774-24gcv\" (UID: \"21aad9a0-00de-4f42-9923-6c66c79a3a8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.320930 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/176cb3ad-1201-420f-bdb2-586f974aeaf2-metrics-tls\") pod \"dns-default-mv87q\" (UID: \"176cb3ad-1201-420f-bdb2-586f974aeaf2\") " pod="openshift-dns/dns-default-mv87q" Feb 19 09:47:13 crc 
kubenswrapper[4873]: I0219 09:47:13.320993 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4fmb\" (UniqueName: \"kubernetes.io/projected/7a3637cc-cfef-446c-b0fb-f37f3396e0d7-kube-api-access-b4fmb\") pod \"machine-config-server-2798g\" (UID: \"7a3637cc-cfef-446c-b0fb-f37f3396e0d7\") " pod="openshift-machine-config-operator/machine-config-server-2798g" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321061 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5920bdb-afd9-401e-8f11-108a90660e1c-metrics-tls\") pod \"ingress-operator-5b745b69d9-mwl9k\" (UID: \"a5920bdb-afd9-401e-8f11-108a90660e1c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321090 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t257b\" (UniqueName: \"kubernetes.io/projected/176cb3ad-1201-420f-bdb2-586f974aeaf2-kube-api-access-t257b\") pod \"dns-default-mv87q\" (UID: \"176cb3ad-1201-420f-bdb2-586f974aeaf2\") " pod="openshift-dns/dns-default-mv87q" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321164 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c6tw\" (UniqueName: \"kubernetes.io/projected/7671d99c-f025-4e36-b336-106655ec13ef-kube-api-access-5c6tw\") pod \"dns-operator-744455d44c-qmrn5\" (UID: \"7671d99c-f025-4e36-b336-106655ec13ef\") " pod="openshift-dns-operator/dns-operator-744455d44c-qmrn5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321226 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2948a5a7-4d94-4314-acdf-489dd93609b9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321254 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5920bdb-afd9-401e-8f11-108a90660e1c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mwl9k\" (UID: \"a5920bdb-afd9-401e-8f11-108a90660e1c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321306 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21aad9a0-00de-4f42-9923-6c66c79a3a8d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-24gcv\" (UID: \"21aad9a0-00de-4f42-9923-6c66c79a3a8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321333 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/40382b72-88a7-4f37-9192-a555a259d4bd-etcd-client\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321375 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfjv2\" (UniqueName: \"kubernetes.io/projected/2877ec4c-7a3e-4105-ac87-6d096df10661-kube-api-access-qfjv2\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321664 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpfg6\" (UniqueName: 
\"kubernetes.io/projected/829eb540-5f77-4748-a99d-c5bdbd13c26f-kube-api-access-gpfg6\") pod \"service-ca-9c57cc56f-tjxkj\" (UID: \"829eb540-5f77-4748-a99d-c5bdbd13c26f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tjxkj" Feb 19 09:47:13 crc kubenswrapper[4873]: E0219 09:47:13.321683 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:13.821659654 +0000 UTC m=+143.111091292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321717 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5e97ddb-b404-4ce2-b760-2739c36c755a-tmpfs\") pod \"packageserver-d55dfcdfc-rpsnj\" (UID: \"c5e97ddb-b404-4ce2-b760-2739c36c755a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321746 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e3dce33-cc6d-41b5-ac17-481a98c06373-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8cxf7\" (UID: \"2e3dce33-cc6d-41b5-ac17-481a98c06373\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321776 4873 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f69ad03d-7d61-4b31-a556-325751fcba8e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kx6gv\" (UID: \"f69ad03d-7d61-4b31-a556-325751fcba8e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321801 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-socket-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321828 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f69ad03d-7d61-4b31-a556-325751fcba8e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kx6gv\" (UID: \"f69ad03d-7d61-4b31-a556-325751fcba8e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321851 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/829eb540-5f77-4748-a99d-c5bdbd13c26f-signing-key\") pod \"service-ca-9c57cc56f-tjxkj\" (UID: \"829eb540-5f77-4748-a99d-c5bdbd13c26f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tjxkj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321873 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-registry-tls\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.321888 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40382b72-88a7-4f37-9192-a555a259d4bd-config\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.322626 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40382b72-88a7-4f37-9192-a555a259d4bd-config\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.323707 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f905b5ea-71df-4b1c-997c-d68766bcfcfe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-86hhq\" (UID: \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\") " pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.323984 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5e97ddb-b404-4ce2-b760-2739c36c755a-tmpfs\") pod \"packageserver-d55dfcdfc-rpsnj\" (UID: \"c5e97ddb-b404-4ce2-b760-2739c36c755a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.324153 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/40382b72-88a7-4f37-9192-a555a259d4bd-etcd-ca\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.325854 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2948a5a7-4d94-4314-acdf-489dd93609b9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.325977 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e3dce33-cc6d-41b5-ac17-481a98c06373-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8cxf7\" (UID: \"2e3dce33-cc6d-41b5-ac17-481a98c06373\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.327464 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2948a5a7-4d94-4314-acdf-489dd93609b9-registry-certificates\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.327787 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5920bdb-afd9-401e-8f11-108a90660e1c-trusted-ca\") pod \"ingress-operator-5b745b69d9-mwl9k\" (UID: \"a5920bdb-afd9-401e-8f11-108a90660e1c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.329162 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f69ad03d-7d61-4b31-a556-325751fcba8e-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-kx6gv\" (UID: \"f69ad03d-7d61-4b31-a556-325751fcba8e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.331836 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/40382b72-88a7-4f37-9192-a555a259d4bd-etcd-service-ca\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.333616 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf2de5cd-4280-4c0c-9276-b693a51986b7-cert\") pod \"ingress-canary-p97g8\" (UID: \"bf2de5cd-4280-4c0c-9276-b693a51986b7\") " pod="openshift-ingress-canary/ingress-canary-p97g8" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.333802 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2948a5a7-4d94-4314-acdf-489dd93609b9-trusted-ca\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.333808 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40382b72-88a7-4f37-9192-a555a259d4bd-serving-cert\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.335612 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/2948a5a7-4d94-4314-acdf-489dd93609b9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.336095 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7671d99c-f025-4e36-b336-106655ec13ef-metrics-tls\") pod \"dns-operator-744455d44c-qmrn5\" (UID: \"7671d99c-f025-4e36-b336-106655ec13ef\") " pod="openshift-dns-operator/dns-operator-744455d44c-qmrn5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.336533 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b283da7-d736-4ac2-a290-e142728e838a-srv-cert\") pod \"olm-operator-6b444d44fb-2b5f5\" (UID: \"5b283da7-d736-4ac2-a290-e142728e838a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.337232 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/829eb540-5f77-4748-a99d-c5bdbd13c26f-signing-cabundle\") pod \"service-ca-9c57cc56f-tjxkj\" (UID: \"829eb540-5f77-4748-a99d-c5bdbd13c26f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tjxkj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.337510 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5e97ddb-b404-4ce2-b760-2739c36c755a-webhook-cert\") pod \"packageserver-d55dfcdfc-rpsnj\" (UID: \"c5e97ddb-b404-4ce2-b760-2739c36c755a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.337546 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/f905b5ea-71df-4b1c-997c-d68766bcfcfe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-86hhq\" (UID: \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\") " pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.338666 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.339241 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5e97ddb-b404-4ce2-b760-2739c36c755a-apiservice-cert\") pod \"packageserver-d55dfcdfc-rpsnj\" (UID: \"c5e97ddb-b404-4ce2-b760-2739c36c755a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.339514 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5b283da7-d736-4ac2-a290-e142728e838a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2b5f5\" (UID: \"5b283da7-d736-4ac2-a290-e142728e838a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.340523 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7a3637cc-cfef-446c-b0fb-f37f3396e0d7-certs\") pod \"machine-config-server-2798g\" (UID: \"7a3637cc-cfef-446c-b0fb-f37f3396e0d7\") " pod="openshift-machine-config-operator/machine-config-server-2798g" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.340563 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e3dce33-cc6d-41b5-ac17-481a98c06373-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-8cxf7\" (UID: \"2e3dce33-cc6d-41b5-ac17-481a98c06373\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.340602 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/40382b72-88a7-4f37-9192-a555a259d4bd-etcd-client\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.341300 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx" event={"ID":"5198f9e2-ae27-4804-ab74-0759a5217d89","Type":"ContainerStarted","Data":"5c3be8f3d5492a9f7a86dbd9963bf3d401162083d542b8b1814d89232233bc6a"} Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.341605 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21aad9a0-00de-4f42-9923-6c66c79a3a8d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-24gcv\" (UID: \"21aad9a0-00de-4f42-9923-6c66c79a3a8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.342358 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21aad9a0-00de-4f42-9923-6c66c79a3a8d-proxy-tls\") pod \"machine-config-controller-84d6567774-24gcv\" (UID: \"21aad9a0-00de-4f42-9923-6c66c79a3a8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.342470 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4bf2ad48-6696-4f08-adc8-330fd4c25028-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jjkrt\" (UID: \"4bf2ad48-6696-4f08-adc8-330fd4c25028\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.342622 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5920bdb-afd9-401e-8f11-108a90660e1c-metrics-tls\") pod \"ingress-operator-5b745b69d9-mwl9k\" (UID: \"a5920bdb-afd9-401e-8f11-108a90660e1c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.342871 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f69ad03d-7d61-4b31-a556-325751fcba8e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kx6gv\" (UID: \"f69ad03d-7d61-4b31-a556-325751fcba8e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.345215 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-registry-tls\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.348913 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/829eb540-5f77-4748-a99d-c5bdbd13c26f-signing-key\") pod \"service-ca-9c57cc56f-tjxkj\" (UID: \"829eb540-5f77-4748-a99d-c5bdbd13c26f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tjxkj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.356887 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7a3637cc-cfef-446c-b0fb-f37f3396e0d7-node-bootstrap-token\") pod \"machine-config-server-2798g\" (UID: \"7a3637cc-cfef-446c-b0fb-f37f3396e0d7\") " pod="openshift-machine-config-operator/machine-config-server-2798g" Feb 19 09:47:13 crc kubenswrapper[4873]: W0219 09:47:13.357073 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5288b888_1b48_4590_8d10_f3688ba87a41.slice/crio-f764265afa3ab2c3c644630fe40a06105e4fbe716760c85c00682cc829d0c1c4 WatchSource:0}: Error finding container f764265afa3ab2c3c644630fe40a06105e4fbe716760c85c00682cc829d0c1c4: Status 404 returned error can't find the container with id f764265afa3ab2c3c644630fe40a06105e4fbe716760c85c00682cc829d0c1c4 Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.357093 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lkp4m"] Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.357740 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpfg6\" (UniqueName: \"kubernetes.io/projected/829eb540-5f77-4748-a99d-c5bdbd13c26f-kube-api-access-gpfg6\") pod \"service-ca-9c57cc56f-tjxkj\" (UID: \"829eb540-5f77-4748-a99d-c5bdbd13c26f\") " pod="openshift-service-ca/service-ca-9c57cc56f-tjxkj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.373316 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sstl4\" (UniqueName: \"kubernetes.io/projected/bf2de5cd-4280-4c0c-9276-b693a51986b7-kube-api-access-sstl4\") pod \"ingress-canary-p97g8\" (UID: \"bf2de5cd-4280-4c0c-9276-b693a51986b7\") " pod="openshift-ingress-canary/ingress-canary-p97g8" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.386096 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w"] Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.389274 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5920bdb-afd9-401e-8f11-108a90660e1c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mwl9k\" (UID: \"a5920bdb-afd9-401e-8f11-108a90660e1c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.405155 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-tjxkj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.412544 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc"] Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.414273 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e3dce33-cc6d-41b5-ac17-481a98c06373-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8cxf7\" (UID: \"2e3dce33-cc6d-41b5-ac17-481a98c06373\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.422873 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t257b\" (UniqueName: \"kubernetes.io/projected/176cb3ad-1201-420f-bdb2-586f974aeaf2-kube-api-access-t257b\") pod \"dns-default-mv87q\" (UID: \"176cb3ad-1201-420f-bdb2-586f974aeaf2\") " pod="openshift-dns/dns-default-mv87q" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.422917 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfjv2\" (UniqueName: \"kubernetes.io/projected/2877ec4c-7a3e-4105-ac87-6d096df10661-kube-api-access-qfjv2\") pod \"csi-hostpathplugin-d75st\" 
(UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.422943 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-socket-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.422983 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-plugins-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.423021 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-mountpoint-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.423035 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/176cb3ad-1201-420f-bdb2-586f974aeaf2-config-volume\") pod \"dns-default-mv87q\" (UID: \"176cb3ad-1201-420f-bdb2-586f974aeaf2\") " pod="openshift-dns/dns-default-mv87q" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.423139 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-registration-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " 
pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.423180 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-csi-data-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.423212 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/176cb3ad-1201-420f-bdb2-586f974aeaf2-metrics-tls\") pod \"dns-default-mv87q\" (UID: \"176cb3ad-1201-420f-bdb2-586f974aeaf2\") " pod="openshift-dns/dns-default-mv87q" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.423234 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: E0219 09:47:13.423536 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:13.923524693 +0000 UTC m=+143.212956331 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.424074 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-socket-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.424133 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-plugins-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.424165 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-mountpoint-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.424706 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-csi-data-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc 
kubenswrapper[4873]: I0219 09:47:13.424776 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2877ec4c-7a3e-4105-ac87-6d096df10661-registration-dir\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.425571 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/176cb3ad-1201-420f-bdb2-586f974aeaf2-config-volume\") pod \"dns-default-mv87q\" (UID: \"176cb3ad-1201-420f-bdb2-586f974aeaf2\") " pod="openshift-dns/dns-default-mv87q" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.428087 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/176cb3ad-1201-420f-bdb2-586f974aeaf2-metrics-tls\") pod \"dns-default-mv87q\" (UID: \"176cb3ad-1201-420f-bdb2-586f974aeaf2\") " pod="openshift-dns/dns-default-mv87q" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.429167 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t66n7\" (UniqueName: \"kubernetes.io/projected/f905b5ea-71df-4b1c-997c-d68766bcfcfe-kube-api-access-t66n7\") pod \"marketplace-operator-79b997595-86hhq\" (UID: \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\") " pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.454322 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm7zn\" (UniqueName: \"kubernetes.io/projected/c5e97ddb-b404-4ce2-b760-2739c36c755a-kube-api-access-tm7zn\") pod \"packageserver-d55dfcdfc-rpsnj\" (UID: \"c5e97ddb-b404-4ce2-b760-2739c36c755a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.464562 4873 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p97g8" Feb 19 09:47:13 crc kubenswrapper[4873]: W0219 09:47:13.471947 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12ef881d_885a_4215_bd57_27966cb209b8.slice/crio-5fd5fa13184676d249c8368aa4595a57293f54ca5e72b3279b3a50e064929529 WatchSource:0}: Error finding container 5fd5fa13184676d249c8368aa4595a57293f54ca5e72b3279b3a50e064929529: Status 404 returned error can't find the container with id 5fd5fa13184676d249c8368aa4595a57293f54ca5e72b3279b3a50e064929529 Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.474215 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c6tw\" (UniqueName: \"kubernetes.io/projected/7671d99c-f025-4e36-b336-106655ec13ef-kube-api-access-5c6tw\") pod \"dns-operator-744455d44c-qmrn5\" (UID: \"7671d99c-f025-4e36-b336-106655ec13ef\") " pod="openshift-dns-operator/dns-operator-744455d44c-qmrn5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.499305 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4xtz\" (UniqueName: \"kubernetes.io/projected/a5920bdb-afd9-401e-8f11-108a90660e1c-kube-api-access-p4xtz\") pod \"ingress-operator-5b745b69d9-mwl9k\" (UID: \"a5920bdb-afd9-401e-8f11-108a90660e1c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.507923 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4fmb\" (UniqueName: \"kubernetes.io/projected/7a3637cc-cfef-446c-b0fb-f37f3396e0d7-kube-api-access-b4fmb\") pod \"machine-config-server-2798g\" (UID: \"7a3637cc-cfef-446c-b0fb-f37f3396e0d7\") " pod="openshift-machine-config-operator/machine-config-server-2798g" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.514758 4873 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb"] Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.514785 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn"] Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.523684 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:13 crc kubenswrapper[4873]: E0219 09:47:13.524130 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:14.024090776 +0000 UTC m=+143.313522404 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.525299 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f69ad03d-7d61-4b31-a556-325751fcba8e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kx6gv\" (UID: \"f69ad03d-7d61-4b31-a556-325751fcba8e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.546255 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slpss\" (UniqueName: \"kubernetes.io/projected/40382b72-88a7-4f37-9192-a555a259d4bd-kube-api-access-slpss\") pod \"etcd-operator-b45778765-vklwp\" (UID: \"40382b72-88a7-4f37-9192-a555a259d4bd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" Feb 19 09:47:13 crc kubenswrapper[4873]: W0219 09:47:13.553079 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34f3caca_1b4c_493d_a10b_277b42d7ce72.slice/crio-ae45da8c938f4f64005c9bf6e462b1759bbbeee8b1b60b03c29e259f6c2bb43b WatchSource:0}: Error finding container ae45da8c938f4f64005c9bf6e462b1759bbbeee8b1b60b03c29e259f6c2bb43b: Status 404 returned error can't find the container with id ae45da8c938f4f64005c9bf6e462b1759bbbeee8b1b60b03c29e259f6c2bb43b Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.566301 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-bound-sa-token\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.586707 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5"] Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.613998 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-478rd\" (UniqueName: \"kubernetes.io/projected/5b283da7-d736-4ac2-a290-e142728e838a-kube-api-access-478rd\") pod \"olm-operator-6b444d44fb-2b5f5\" (UID: \"5b283da7-d736-4ac2-a290-e142728e838a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.624977 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.625308 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8trrg\" (UniqueName: \"kubernetes.io/projected/2e3dce33-cc6d-41b5-ac17-481a98c06373-kube-api-access-8trrg\") pod \"cluster-image-registry-operator-dc59b4c8b-8cxf7\" (UID: \"2e3dce33-cc6d-41b5-ac17-481a98c06373\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" Feb 19 09:47:13 crc kubenswrapper[4873]: E0219 09:47:13.626555 4873 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:14.126538912 +0000 UTC m=+143.415970560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.663229 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8h4w\" (UniqueName: \"kubernetes.io/projected/4bf2ad48-6696-4f08-adc8-330fd4c25028-kube-api-access-c8h4w\") pod \"package-server-manager-789f6589d5-jjkrt\" (UID: \"4bf2ad48-6696-4f08-adc8-330fd4c25028\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.663572 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s67xb"] Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.664136 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.669304 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.679599 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.680229 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7wm6\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-kube-api-access-q7wm6\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.692838 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qmrn5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.695207 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdbm8\" (UniqueName: \"kubernetes.io/projected/21aad9a0-00de-4f42-9923-6c66c79a3a8d-kube-api-access-xdbm8\") pod \"machine-config-controller-84d6567774-24gcv\" (UID: \"21aad9a0-00de-4f42-9923-6c66c79a3a8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.696230 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw"] Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.698044 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.710886 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t257b\" (UniqueName: \"kubernetes.io/projected/176cb3ad-1201-420f-bdb2-586f974aeaf2-kube-api-access-t257b\") pod \"dns-default-mv87q\" (UID: \"176cb3ad-1201-420f-bdb2-586f974aeaf2\") " pod="openshift-dns/dns-default-mv87q" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.711672 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.717516 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.727369 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.727817 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" Feb 19 09:47:13 crc kubenswrapper[4873]: E0219 09:47:13.727981 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:14.227960339 +0000 UTC m=+143.517392037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.743225 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfjv2\" (UniqueName: \"kubernetes.io/projected/2877ec4c-7a3e-4105-ac87-6d096df10661-kube-api-access-qfjv2\") pod \"csi-hostpathplugin-d75st\" (UID: \"2877ec4c-7a3e-4105-ac87-6d096df10661\") " pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.749481 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2798g" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.753353 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.803568 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mv87q" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.804376 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-d75st" Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.848192 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: E0219 09:47:13.848617 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:14.348601669 +0000 UTC m=+143.638033317 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.950431 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:13 crc kubenswrapper[4873]: E0219 09:47:13.950694 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:14.450668015 +0000 UTC m=+143.740099663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:13 crc kubenswrapper[4873]: I0219 09:47:13.950846 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:13 crc kubenswrapper[4873]: E0219 09:47:13.952071 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:14.452059185 +0000 UTC m=+143.741490833 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:13.996539 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv" Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.002364 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg"] Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.010902 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5tt6k"] Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.042697 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7gp2" podStartSLOduration=117.042677051 podStartE2EDuration="1m57.042677051s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:14.022363327 +0000 UTC m=+143.311794965" watchObservedRunningTime="2026-02-19 09:47:14.042677051 +0000 UTC m=+143.332108709" Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.044914 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw"] Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.052536 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:14 crc kubenswrapper[4873]: E0219 09:47:14.052952 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:14.552924876 +0000 UTC m=+143.842356514 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.086932 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9pq25"] Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.087895 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr"] Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.153734 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:14 crc kubenswrapper[4873]: E0219 
09:47:14.154073 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:14.654062435 +0000 UTC m=+143.943494073 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.186829 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tjxkj"] Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.211041 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p97g8"] Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.255436 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:14 crc kubenswrapper[4873]: E0219 09:47:14.255655 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:14.755614766 +0000 UTC m=+144.045046434 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.255798 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:14 crc kubenswrapper[4873]: E0219 09:47:14.256631 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:14.756602784 +0000 UTC m=+144.046034422 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.264612 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv"] Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.325705 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7"] Feb 19 09:47:14 crc kubenswrapper[4873]: W0219 09:47:14.328228 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57d54c43_611a_40f1_b05e_9a0007dbe3ec.slice/crio-1ef6994b71107d589c7222906c9783e6710458ddf932d1bdcaf1c6580c2ab010 WatchSource:0}: Error finding container 1ef6994b71107d589c7222906c9783e6710458ddf932d1bdcaf1c6580c2ab010: Status 404 returned error can't find the container with id 1ef6994b71107d589c7222906c9783e6710458ddf932d1bdcaf1c6580c2ab010 Feb 19 09:47:14 crc kubenswrapper[4873]: W0219 09:47:14.347695 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0f6a9a1_70e5_46ce_97aa_3dc9d26c672e.slice/crio-0daa5ed7cee4ce149fc32b28c1540ff717542388480b099848cac564d411d3c5 WatchSource:0}: Error finding container 0daa5ed7cee4ce149fc32b28c1540ff717542388480b099848cac564d411d3c5: Status 404 returned error can't find the container with id 0daa5ed7cee4ce149fc32b28c1540ff717542388480b099848cac564d411d3c5 Feb 19 09:47:14 crc kubenswrapper[4873]: W0219 
09:47:14.350966 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06e4a751_614f_49d2_8246_c76419d1ccb4.slice/crio-7f930d77d6c1b794c971625c5385cd1fdd036f11c2a23b5d63fabb2ad61b5233 WatchSource:0}: Error finding container 7f930d77d6c1b794c971625c5385cd1fdd036f11c2a23b5d63fabb2ad61b5233: Status 404 returned error can't find the container with id 7f930d77d6c1b794c971625c5385cd1fdd036f11c2a23b5d63fabb2ad61b5233 Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.357710 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:14 crc kubenswrapper[4873]: E0219 09:47:14.358639 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:14.858622708 +0000 UTC m=+144.148054346 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.357744 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx" event={"ID":"5198f9e2-ae27-4804-ab74-0759a5217d89","Type":"ContainerStarted","Data":"cd05e369e7782989017ff1d32b735cd57d5b43295d78b3f652974d003ba29fb5"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.370608 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b" event={"ID":"df659e7d-39ab-41ee-8df5-08896976666c","Type":"ContainerStarted","Data":"97010ed8078531daee035bd37dff34f975c08ac9590e5e55ac162c150e964363"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.370653 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b" event={"ID":"df659e7d-39ab-41ee-8df5-08896976666c","Type":"ContainerStarted","Data":"99a5de09374b447b2a6cec4a863f8114702137a0313f0efe6a40ce78a9023ce6"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.385808 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg" event={"ID":"3c4f7134-312f-4f1d-a344-80d44d65c371","Type":"ContainerStarted","Data":"0bc85a48940755ef83d2dac6f27402777234d5037699b4a2566555cf4a968d32"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.387595 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5tt6k" event={"ID":"57d54c43-611a-40f1-b05e-9a0007dbe3ec","Type":"ContainerStarted","Data":"1ef6994b71107d589c7222906c9783e6710458ddf932d1bdcaf1c6580c2ab010"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.395562 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn" event={"ID":"79bb3a49-346f-49b7-bb8e-c358105f8035","Type":"ContainerStarted","Data":"c802d62e0598cf4480ef9c8f5a2bddefb86ea0c808ed027dd26a5ef64907d8b5"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.398170 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj"] Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.400157 4873 csr.go:261] certificate signing request csr-ckc9p is approved, waiting to be issued Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.403806 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" event={"ID":"9e9b2e26-976d-498c-88d8-dbddd520c9bf","Type":"ContainerStarted","Data":"124dc5c9ec08792dda6c444f4fb30401ef2bc5d4dac9d8ec3bd082febd153f0a"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.406415 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd" event={"ID":"bb18a52e-b1db-406b-a2e8-88a1ae8b05fc","Type":"ContainerStarted","Data":"73d8bf9e9107a32db28a8a87ed6db234be57692c177e07c6e50d5e5fc93dd98b"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.409343 4873 csr.go:257] certificate signing request csr-ckc9p is issued Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.412551 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc" 
event={"ID":"12ef881d-885a-4215-bd57-27966cb209b8","Type":"ContainerStarted","Data":"5fd5fa13184676d249c8368aa4595a57293f54ca5e72b3279b3a50e064929529"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.416359 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5" event={"ID":"6f60efd0-54f5-43eb-b824-f8eaa836df60","Type":"ContainerStarted","Data":"1e01e7bdb7bbbc36454cea754639a33a0dfeec0a18d5a42fedc4ef2e29d29e2a"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.422254 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lkp4m" event={"ID":"5288b888-1b48-4590-8d10-f3688ba87a41","Type":"ContainerStarted","Data":"f764265afa3ab2c3c644630fe40a06105e4fbe716760c85c00682cc829d0c1c4"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.429466 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-shnwj" event={"ID":"10aa25f4-7549-468a-b42f-19305ad066dd","Type":"ContainerStarted","Data":"7a581424f0da8ea44b76eb3be0d323e922f9fdfbe4bef5b6c66bc43929d92666"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.435992 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w" event={"ID":"ad63def1-18c4-4841-a936-b7c7e42ce092","Type":"ContainerStarted","Data":"22633f6be15e3066ccd2857d6ceca2e0f25fb8362c2866a0b4c282b81d4d81a2"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.442840 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-4g545" Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.444078 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw" 
event={"ID":"de77b9aa-b558-4431-b116-5e1e1cc116f3","Type":"ContainerStarted","Data":"a018522e013b75a19d6f1ebe089ac24d73537d03912cdb27eb2e286e6cfe33f1"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.447178 4873 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-4g545 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.447236 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-4g545" podUID="9324aa8b-fbce-42bb-b339-0aa2e382efd4" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.447616 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb" event={"ID":"4b881e81-67ed-4c33-a992-da59d7996b9d","Type":"ContainerStarted","Data":"36bec1813ef9454858204464e136a511460f50dc9a16eee6fd55fe827e0740ca"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.459367 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.473303 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24" 
event={"ID":"595c8db4-733e-4729-aa34-8be7307043a8","Type":"ContainerStarted","Data":"6baff13dd3bc0e3cfc61c01c263ffef22ede179c3b6f7ae6ac471e48b8576db9"} Feb 19 09:47:14 crc kubenswrapper[4873]: W0219 09:47:14.475377 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e3dce33_cc6d_41b5_ac17_481a98c06373.slice/crio-efe28b822db8d99d38099b70f217108261222e01289341596d0f1c9c87197060 WatchSource:0}: Error finding container efe28b822db8d99d38099b70f217108261222e01289341596d0f1c9c87197060: Status 404 returned error can't find the container with id efe28b822db8d99d38099b70f217108261222e01289341596d0f1c9c87197060 Feb 19 09:47:14 crc kubenswrapper[4873]: E0219 09:47:14.484542 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:14.984518779 +0000 UTC m=+144.273950417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.526706 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s67xb" event={"ID":"d639ff25-343e-4e7c-bd2e-f5fc533923f4","Type":"ContainerStarted","Data":"c8a7034bee2eb267f1a3c1b5d1b92b9d227d88fb49542da1a2989e40d3218146"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.547379 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-dxcz7" event={"ID":"b2d87932-1993-464d-b3d2-71025526e1f2","Type":"ContainerStarted","Data":"52c8c7565f7e76e2b1be9ed7ec38817313b26529cc4216d67909c7d5b38aadf8"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.548194 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-dxcz7" Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.555775 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" event={"ID":"c8c1d3a6-23fd-4526-8892-0add23b09a9a","Type":"ContainerStarted","Data":"ca0ba083f2d897c6b2f519cbc9b73b7e76a6575165553e074c67d17692757e96"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.556643 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.560009 4873 patch_prober.go:28] interesting 
pod/controller-manager-879f6c89f-qvxgz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.560058 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" podUID="c8c1d3a6-23fd-4526-8892-0add23b09a9a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.560728 4873 patch_prober.go:28] interesting pod/console-operator-58897d9998-dxcz7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.560770 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-dxcz7" podUID="b2d87932-1993-464d-b3d2-71025526e1f2" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.562318 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:14 crc kubenswrapper[4873]: E0219 09:47:14.563477 4873 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:15.06345896 +0000 UTC m=+144.352890598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.574129 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kzpbf" event={"ID":"34f3caca-1b4c-493d-a10b-277b42d7ce72","Type":"ContainerStarted","Data":"ae45da8c938f4f64005c9bf6e462b1759bbbeee8b1b60b03c29e259f6c2bb43b"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.576864 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" event={"ID":"bd468f98-7720-4f9a-972f-684b96f4f90f","Type":"ContainerStarted","Data":"00d137182546ceb731d1231ff4489ff44e56001f5469f15e0d3bd78dd28af61d"} Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.577213 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.585343 4873 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qltqp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" 
start-of-body= Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.585398 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" podUID="bd468f98-7720-4f9a-972f-684b96f4f90f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.648703 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k"] Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.663848 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:14 crc kubenswrapper[4873]: E0219 09:47:14.665631 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:15.165612298 +0000 UTC m=+144.455043936 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.766768 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:14 crc kubenswrapper[4873]: E0219 09:47:14.766961 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:15.266935952 +0000 UTC m=+144.556367590 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.767023 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:14 crc kubenswrapper[4873]: E0219 09:47:14.767405 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:15.267392845 +0000 UTC m=+144.556824483 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.867994 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:14 crc kubenswrapper[4873]: E0219 09:47:14.868365 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:15.368341369 +0000 UTC m=+144.657773007 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.868464 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:14 crc kubenswrapper[4873]: E0219 09:47:14.868811 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:15.368798242 +0000 UTC m=+144.658229880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.969183 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:14 crc kubenswrapper[4873]: E0219 09:47:14.969332 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:15.469304941 +0000 UTC m=+144.758736569 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:14 crc kubenswrapper[4873]: I0219 09:47:14.969477 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:14 crc kubenswrapper[4873]: E0219 09:47:14.969766 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:15.469753034 +0000 UTC m=+144.759184672 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.072748 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:15 crc kubenswrapper[4873]: E0219 09:47:15.072962 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:15.572924672 +0000 UTC m=+144.862356310 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.076467 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:15 crc kubenswrapper[4873]: E0219 09:47:15.076980 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:15.576965858 +0000 UTC m=+144.866397496 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.127011 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-86hhq"] Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.153164 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mv87q"] Feb 19 09:47:15 crc kubenswrapper[4873]: W0219 09:47:15.182636 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf905b5ea_71df_4b1c_997c_d68766bcfcfe.slice/crio-019acffae30ee36980fd8260d8a8299738a95c80eb08007a6c0478560261a038 WatchSource:0}: Error finding container 019acffae30ee36980fd8260d8a8299738a95c80eb08007a6c0478560261a038: Status 404 returned error can't find the container with id 019acffae30ee36980fd8260d8a8299738a95c80eb08007a6c0478560261a038 Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.183805 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:15 crc kubenswrapper[4873]: E0219 09:47:15.184260 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-02-19 09:47:15.684245564 +0000 UTC m=+144.973677202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.185346 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vklwp"] Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.272536 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qmrn5"] Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.277488 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-d75st"] Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.290141 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:15 crc kubenswrapper[4873]: E0219 09:47:15.290447 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:15.790432458 +0000 UTC m=+145.079864086 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:15 crc kubenswrapper[4873]: W0219 09:47:15.290755 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40382b72_88a7_4f37_9192_a555a259d4bd.slice/crio-6345f8656fe0ece4ffd3ff39617df97541f890972726246e1b10a7ec73f814ea WatchSource:0}: Error finding container 6345f8656fe0ece4ffd3ff39617df97541f890972726246e1b10a7ec73f814ea: Status 404 returned error can't find the container with id 6345f8656fe0ece4ffd3ff39617df97541f890972726246e1b10a7ec73f814ea Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.292766 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt"] Feb 19 09:47:15 crc kubenswrapper[4873]: W0219 09:47:15.301601 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2877ec4c_7a3e_4105_ac87_6d096df10661.slice/crio-859ef3d4364f37b532f02704ca10a3e89ecac979d886e1164e57fb08f4febc07 WatchSource:0}: Error finding container 859ef3d4364f37b532f02704ca10a3e89ecac979d886e1164e57fb08f4febc07: Status 404 returned error can't find the container with id 859ef3d4364f37b532f02704ca10a3e89ecac979d886e1164e57fb08f4febc07 Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.346257 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2kgbd" 
podStartSLOduration=118.346241353 podStartE2EDuration="1m58.346241353s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:15.34267827 +0000 UTC m=+144.632109908" watchObservedRunningTime="2026-02-19 09:47:15.346241353 +0000 UTC m=+144.635672991" Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.366944 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jt5wx" podStartSLOduration=118.366927548 podStartE2EDuration="1m58.366927548s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:15.362369887 +0000 UTC m=+144.651801545" watchObservedRunningTime="2026-02-19 09:47:15.366927548 +0000 UTC m=+144.656359186" Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.393653 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:15 crc kubenswrapper[4873]: E0219 09:47:15.394002 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:15.893984366 +0000 UTC m=+145.183416004 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.411966 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-19 09:42:14 +0000 UTC, rotation deadline is 2026-12-07 09:21:22.756176769 +0000 UTC
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.412031 4873 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6983h34m7.344149114s for next certificate rotation
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.452739 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5"]
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.453745 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jsc24" podStartSLOduration=118.453736525 podStartE2EDuration="1m58.453736525s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:15.44626747 +0000 UTC m=+144.735699108" watchObservedRunningTime="2026-02-19 09:47:15.453736525 +0000 UTC m=+144.743168163"
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.463645 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv"]
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.482934 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-dxcz7" podStartSLOduration=118.482918014 podStartE2EDuration="1m58.482918014s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:15.482174542 +0000 UTC m=+144.771606190" watchObservedRunningTime="2026-02-19 09:47:15.482918014 +0000 UTC m=+144.772349652"
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.495445 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq"
Feb 19 09:47:15 crc kubenswrapper[4873]: E0219 09:47:15.495860 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:15.995838045 +0000 UTC m=+145.285269673 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.529160 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-shnwj" podStartSLOduration=118.529144883 podStartE2EDuration="1m58.529144883s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:15.52763081 +0000 UTC m=+144.817062458" watchObservedRunningTime="2026-02-19 09:47:15.529144883 +0000 UTC m=+144.818576521"
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.563788 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" podStartSLOduration=118.563752319 podStartE2EDuration="1m58.563752319s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:15.561400991 +0000 UTC m=+144.850832629" watchObservedRunningTime="2026-02-19 09:47:15.563752319 +0000 UTC m=+144.853183967"
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.597450 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 09:47:15 crc kubenswrapper[4873]: E0219 09:47:15.597831 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:16.097816379 +0000 UTC m=+145.387248017 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.691272 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" podStartSLOduration=118.691254146 podStartE2EDuration="1m58.691254146s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:15.689170616 +0000 UTC m=+144.978602254" watchObservedRunningTime="2026-02-19 09:47:15.691254146 +0000 UTC m=+144.980685784"
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.699389 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq"
Feb 19 09:47:15 crc kubenswrapper[4873]: E0219 09:47:15.699851 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:16.199835383 +0000 UTC m=+145.489267021 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.719367 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc" event={"ID":"12ef881d-885a-4215-bd57-27966cb209b8","Type":"ContainerStarted","Data":"458a152c636f20af08d4a1f9dd12949e038e645a9c738e6ffd5c5d62d089b63d"}
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.719408 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5" event={"ID":"6f60efd0-54f5-43eb-b824-f8eaa836df60","Type":"ContainerStarted","Data":"4bb2a7cc8777b708c0b821e8f268c52655ec82f1a20a7fe49cae64f31ea580f5"}
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.719425 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2798g" event={"ID":"7a3637cc-cfef-446c-b0fb-f37f3396e0d7","Type":"ContainerStarted","Data":"1e0279da9306def009f2ea0d536091c5b2bdb0ca796b642f04f1a0af27a1ade2"}
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.722614 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-shnwj" event={"ID":"10aa25f4-7549-468a-b42f-19305ad066dd","Type":"ContainerStarted","Data":"cc3b6e572218ab345f1c66c2372f7ae648deb477f5bc61959e0bd3585166d381"}
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.739335 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-4g545" podStartSLOduration=118.739318498 podStartE2EDuration="1m58.739318498s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:15.734342645 +0000 UTC m=+145.023774283" watchObservedRunningTime="2026-02-19 09:47:15.739318498 +0000 UTC m=+145.028750136"
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.739753 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" event={"ID":"a5920bdb-afd9-401e-8f11-108a90660e1c","Type":"ContainerStarted","Data":"806f061284ee7c34be2928a152897dad977e6754cdfd324c860c76b9f536ce51"}
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.800059 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 09:47:15 crc kubenswrapper[4873]: E0219 09:47:15.803752 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:16.303731851 +0000 UTC m=+145.593163479 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.820564 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" event={"ID":"40382b72-88a7-4f37-9192-a555a259d4bd","Type":"ContainerStarted","Data":"6345f8656fe0ece4ffd3ff39617df97541f890972726246e1b10a7ec73f814ea"}
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.897935 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" event={"ID":"2e3dce33-cc6d-41b5-ac17-481a98c06373","Type":"ContainerStarted","Data":"efe28b822db8d99d38099b70f217108261222e01289341596d0f1c9c87197060"}
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.903841 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq"
Feb 19 09:47:15 crc kubenswrapper[4873]: E0219 09:47:15.904158 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:16.404144709 +0000 UTC m=+145.693576347 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.937656 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5tt6k" event={"ID":"57d54c43-611a-40f1-b05e-9a0007dbe3ec","Type":"ContainerStarted","Data":"376202c24805e4276aabe8a4ebb3fff982636b00eaa4f2e0501988b21a7d953f"}
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.959869 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w" event={"ID":"ad63def1-18c4-4841-a936-b7c7e42ce092","Type":"ContainerStarted","Data":"65823cb2edab02571e8dd813d4387207d4d48977582b49633e0fcb68fdcf2590"}
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.966264 4873 generic.go:334] "Generic (PLEG): container finished" podID="9e9b2e26-976d-498c-88d8-dbddd520c9bf" containerID="124dc5c9ec08792dda6c444f4fb30401ef2bc5d4dac9d8ec3bd082febd153f0a" exitCode=0
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.966325 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" event={"ID":"9e9b2e26-976d-498c-88d8-dbddd520c9bf","Type":"ContainerDied","Data":"124dc5c9ec08792dda6c444f4fb30401ef2bc5d4dac9d8ec3bd082febd153f0a"}
Feb 19 09:47:15 crc kubenswrapper[4873]: I0219 09:47:15.966349 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" event={"ID":"9e9b2e26-976d-498c-88d8-dbddd520c9bf","Type":"ContainerStarted","Data":"516187f883da9b7d629e6b7b4719d3c614170b0d549730a990ec738141a2a94b"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.002872 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wfq9w" podStartSLOduration=119.002853178 podStartE2EDuration="1m59.002853178s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:15.999006857 +0000 UTC m=+145.288438495" watchObservedRunningTime="2026-02-19 09:47:16.002853178 +0000 UTC m=+145.292284816"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.004321 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 09:47:16 crc kubenswrapper[4873]: E0219 09:47:16.005466 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:16.505449353 +0000 UTC m=+145.794880991 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.019541 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4g545" event={"ID":"9324aa8b-fbce-42bb-b339-0aa2e382efd4","Type":"ContainerStarted","Data":"97275dc87160b7aaab21e362565d26836620eb22c875e626afdba41f45da3757"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.032753 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d75st" event={"ID":"2877ec4c-7a3e-4105-ac87-6d096df10661","Type":"ContainerStarted","Data":"859ef3d4364f37b532f02704ca10a3e89ecac979d886e1164e57fb08f4febc07"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.041527 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr" event={"ID":"48911b55-fb42-412b-9298-4cba1105a164","Type":"ContainerStarted","Data":"002c07ff8b0050c7c6c05249b1dd992e03979ec683583e4b27531e24bb5b2562"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.054145 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw" event={"ID":"de77b9aa-b558-4431-b116-5e1e1cc116f3","Type":"ContainerStarted","Data":"e60bc2f916aff75454f8db4d5b15c6ae005baebfdcb79c0c87df06d3a9db5142"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.076619 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s67xb" event={"ID":"d639ff25-343e-4e7c-bd2e-f5fc533923f4","Type":"ContainerStarted","Data":"75a0498ae11dea31201abc8dfdf3eb229ec353a4ecf145b8528f1fd23ab07a4e"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.091151 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" podStartSLOduration=119.091133197 podStartE2EDuration="1m59.091133197s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:16.075332943 +0000 UTC m=+145.364764581" watchObservedRunningTime="2026-02-19 09:47:16.091133197 +0000 UTC m=+145.380564835"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.093370 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt" event={"ID":"4bf2ad48-6696-4f08-adc8-330fd4c25028","Type":"ContainerStarted","Data":"ab261c5d820be732f26ee439fe66ecdaa1b4d0ed012e3d18ace9fbd2e4de6b5a"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.105650 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq"
Feb 19 09:47:16 crc kubenswrapper[4873]: E0219 09:47:16.109264 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:16.609244538 +0000 UTC m=+145.898676286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.121511 4873 generic.go:334] "Generic (PLEG): container finished" podID="5968ec26-dea6-4e79-99b1-5954e173d226" containerID="610149f7289ffe0e19460275fe847c040900eefe73d678ea30ef2cab6c0695df" exitCode=0
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.121593 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gbzll" event={"ID":"5968ec26-dea6-4e79-99b1-5954e173d226","Type":"ContainerDied","Data":"610149f7289ffe0e19460275fe847c040900eefe73d678ea30ef2cab6c0695df"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.123401 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" event={"ID":"c5e97ddb-b404-4ce2-b760-2739c36c755a","Type":"ContainerStarted","Data":"0fac0c7f9c572d1fe7c43463ff755f4edcdf7517898f1ac1c0bffad4086649e5"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.124209 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.134724 4873 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rpsnj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" start-of-body=
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.134773 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" podUID="c5e97ddb-b404-4ce2-b760-2739c36c755a" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.135755 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qmrn5" event={"ID":"7671d99c-f025-4e36-b336-106655ec13ef","Type":"ContainerStarted","Data":"51d7fa54d22d6d20f4df5008e8bc0c33ffeb62a7704a6f583a20f644788356c3"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.161479 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv" event={"ID":"f69ad03d-7d61-4b31-a556-325751fcba8e","Type":"ContainerStarted","Data":"af0ed8263d6c2f7d43d615058f9b59d5e06528f281061126ed76bc142d45ab55"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.181984 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" event={"ID":"bd468f98-7720-4f9a-972f-684b96f4f90f","Type":"ContainerStarted","Data":"444f0cf49cee0a425b45e35836ef80b67c64300d173966d869b2bf6f32c4f2d2"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.200080 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw" event={"ID":"06e4a751-614f-49d2-8246-c76419d1ccb4","Type":"ContainerStarted","Data":"19ed82ac93e13dbda08f17685ab76cf45a36618aa1b5cdd0f6ed8debf558ed56"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.200140 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw" event={"ID":"06e4a751-614f-49d2-8246-c76419d1ccb4","Type":"ContainerStarted","Data":"7f930d77d6c1b794c971625c5385cd1fdd036f11c2a23b5d63fabb2ad61b5233"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.201772 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.202238 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.205598 4873 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dg6jw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body=
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.205748 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw" podUID="06e4a751-614f-49d2-8246-c76419d1ccb4" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.206448 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 09:47:16 crc kubenswrapper[4873]: E0219 09:47:16.207707 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:16.707690749 +0000 UTC m=+145.997122387 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.232421 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9pq25" event={"ID":"e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e","Type":"ContainerStarted","Data":"fe3c9d0e10d4bd754ef0e8d8f7d4d741dab93ae4e01beee7ad2c4e1fb5e7655c"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.232459 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9pq25" event={"ID":"e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e","Type":"ContainerStarted","Data":"0daa5ed7cee4ce149fc32b28c1540ff717542388480b099848cac564d411d3c5"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.233290 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-9pq25"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.244174 4873 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pq25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.244280 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9pq25" podUID="e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.245961 4873 generic.go:334] "Generic (PLEG): container finished" podID="3c4f7134-312f-4f1d-a344-80d44d65c371" containerID="1bb88329deecbdc4f73b8c08c936a5a9b759033861f38d2ad6d0f8a5bfa48630" exitCode=0
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.246071 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg" event={"ID":"3c4f7134-312f-4f1d-a344-80d44d65c371","Type":"ContainerDied","Data":"1bb88329deecbdc4f73b8c08c936a5a9b759033861f38d2ad6d0f8a5bfa48630"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.271645 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-tjxkj" event={"ID":"829eb540-5f77-4748-a99d-c5bdbd13c26f","Type":"ContainerStarted","Data":"b45796554170d7263870c3f2f726d814086aaeea96971eab4c58f1dc507e6dd8"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.296952 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb" event={"ID":"4b881e81-67ed-4c33-a992-da59d7996b9d","Type":"ContainerStarted","Data":"3e50b8b153bf2d7257ba5ed2d4d11a2070ac6405ee8e2a20e920dbe2fa585782"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.318179 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq"
Feb 19 09:47:16 crc kubenswrapper[4873]: E0219 09:47:16.318705 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:16.818690622 +0000 UTC m=+146.108122260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.329677 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-dxcz7" event={"ID":"b2d87932-1993-464d-b3d2-71025526e1f2","Type":"ContainerStarted","Data":"e9b65aa400ac589e45085a45d51a94b9442ac42995dbdbdf2ff5aed61ba87bd4"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.336382 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw" podStartSLOduration=119.33636425 podStartE2EDuration="1m59.33636425s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:16.279837675 +0000 UTC m=+145.569269313" watchObservedRunningTime="2026-02-19 09:47:16.33636425 +0000 UTC m=+145.625795888"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.372058 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" podStartSLOduration=119.372039446 podStartE2EDuration="1m59.372039446s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:16.335175176 +0000 UTC m=+145.624606824" watchObservedRunningTime="2026-02-19 09:47:16.372039446 +0000 UTC m=+145.661471084"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.387936 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mv87q" event={"ID":"176cb3ad-1201-420f-bdb2-586f974aeaf2","Type":"ContainerStarted","Data":"d8d905c4fcc5875c73c675b7e5a5b70379e42bdf043c48cb30f8359b1f025c27"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.419535 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.420489 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn" event={"ID":"79bb3a49-346f-49b7-bb8e-c358105f8035","Type":"ContainerStarted","Data":"895da616065009c7730ac4dd615ae2f8e29d586af3aa212e7cc0452f645aa23d"}
Feb 19 09:47:16 crc kubenswrapper[4873]: E0219 09:47:16.421491 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:16.921470348 +0000 UTC m=+146.210901986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.445532 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-tjxkj" podStartSLOduration=119.44551689 podStartE2EDuration="1m59.44551689s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:16.408278789 +0000 UTC m=+145.697710427" watchObservedRunningTime="2026-02-19 09:47:16.44551689 +0000 UTC m=+145.734948528"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.501070 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-s67xb" podStartSLOduration=119.501030997 podStartE2EDuration="1m59.501030997s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:16.444073218 +0000 UTC m=+145.733504856" watchObservedRunningTime="2026-02-19 09:47:16.501030997 +0000 UTC m=+145.790462635"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.503933 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" event={"ID":"f905b5ea-71df-4b1c-997c-d68766bcfcfe","Type":"ContainerStarted","Data":"019acffae30ee36980fd8260d8a8299738a95c80eb08007a6c0478560261a038"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.504424 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.505272 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-4g545"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.513762 4873 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-86hhq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body=
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.513829 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" podUID="f905b5ea-71df-4b1c-997c-d68766bcfcfe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.521166 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq"
Feb 19 09:47:16 crc kubenswrapper[4873]: E0219 09:47:16.522917 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:17.022894095 +0000 UTC m=+146.312325733 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.537521 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kzpbf" event={"ID":"34f3caca-1b4c-493d-a10b-277b42d7ce72","Type":"ContainerStarted","Data":"b309e65f91adb4e60631ce7b7b48f8032fc701bb11775ba8684ecc4fb05b7104"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.541257 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-9pq25" podStartSLOduration=119.541232683 podStartE2EDuration="1m59.541232683s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:16.503819387 +0000 UTC m=+145.793251025" watchObservedRunningTime="2026-02-19 09:47:16.541232683 +0000 UTC m=+145.830664321"
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.563571 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lkp4m" event={"ID":"5288b888-1b48-4590-8d10-f3688ba87a41","Type":"ContainerStarted","Data":"bb50b1fbd6a6070765941c78978a2ace71c0cb9cfcbff51b47f3f5a81c2d5454"}
Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.595192 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw" podStartSLOduration=119.595172424 podStartE2EDuration="1m59.595172424s" 
podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:16.542303284 +0000 UTC m=+145.831734922" watchObservedRunningTime="2026-02-19 09:47:16.595172424 +0000 UTC m=+145.884604062" Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.608308 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p97g8" event={"ID":"bf2de5cd-4280-4c0c-9276-b693a51986b7","Type":"ContainerStarted","Data":"b00108239b20276ebdb4aea3d0da30d4c829283729a58c6f65febe4073113c18"} Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.623187 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:16 crc kubenswrapper[4873]: E0219 09:47:16.625733 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:17.125702352 +0000 UTC m=+146.415133990 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.628031 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.628249 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:16 crc kubenswrapper[4873]: E0219 09:47:16.631269 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:17.131233581 +0000 UTC m=+146.420665219 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.651647 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bxfwb" podStartSLOduration=119.651625858 podStartE2EDuration="1m59.651625858s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:16.63500622 +0000 UTC m=+145.924437858" watchObservedRunningTime="2026-02-19 09:47:16.651625858 +0000 UTC m=+145.941057496" Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.660613 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-lkp4m" podStartSLOduration=119.660593796 podStartE2EDuration="1m59.660593796s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:16.658345071 +0000 UTC m=+145.947776709" watchObservedRunningTime="2026-02-19 09:47:16.660593796 +0000 UTC m=+145.950025434" Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.711400 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f24fn" podStartSLOduration=119.711381687 podStartE2EDuration="1m59.711381687s" 
podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:16.709552694 +0000 UTC m=+145.998984332" watchObservedRunningTime="2026-02-19 09:47:16.711381687 +0000 UTC m=+146.000813325" Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.742600 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:16 crc kubenswrapper[4873]: E0219 09:47:16.745805 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:17.245783276 +0000 UTC m=+146.535214914 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.828475 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" podStartSLOduration=119.828461654 podStartE2EDuration="1m59.828461654s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:16.827307181 +0000 UTC m=+146.116738819" watchObservedRunningTime="2026-02-19 09:47:16.828461654 +0000 UTC m=+146.117893292" Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.853797 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:16 crc kubenswrapper[4873]: E0219 09:47:16.854086 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:17.354073911 +0000 UTC m=+146.643505549 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.885155 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-dxcz7" Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.939862 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-kzpbf" podStartSLOduration=119.939843227 podStartE2EDuration="1m59.939843227s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:16.884731202 +0000 UTC m=+146.174162830" watchObservedRunningTime="2026-02-19 09:47:16.939843227 +0000 UTC m=+146.229274865" Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.940780 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p97g8" podStartSLOduration=6.940772364 podStartE2EDuration="6.940772364s" podCreationTimestamp="2026-02-19 09:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:16.938501649 +0000 UTC m=+146.227933287" watchObservedRunningTime="2026-02-19 09:47:16.940772364 +0000 UTC m=+146.230204002" Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.954262 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:16 crc kubenswrapper[4873]: E0219 09:47:16.954397 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:17.454357875 +0000 UTC m=+146.743789513 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:16 crc kubenswrapper[4873]: I0219 09:47:16.954867 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:16 crc kubenswrapper[4873]: E0219 09:47:16.955175 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:17.455164468 +0000 UTC m=+146.744596106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.055641 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:17 crc kubenswrapper[4873]: E0219 09:47:17.056056 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:17.556036849 +0000 UTC m=+146.845468487 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.144305 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-k627b" podStartSLOduration=120.144280077 podStartE2EDuration="2m0.144280077s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:17.067737666 +0000 UTC m=+146.357169304" watchObservedRunningTime="2026-02-19 09:47:17.144280077 +0000 UTC m=+146.433711835" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.160062 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:17 crc kubenswrapper[4873]: E0219 09:47:17.170919 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:17.670904703 +0000 UTC m=+146.960336341 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.275441 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:17 crc kubenswrapper[4873]: E0219 09:47:17.276327 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:17.776311175 +0000 UTC m=+147.065742813 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.276437 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:17 crc kubenswrapper[4873]: E0219 09:47:17.276765 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:17.776757598 +0000 UTC m=+147.066189236 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.340938 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.345255 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:17 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:17 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:17 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.345307 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.379609 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:17 crc kubenswrapper[4873]: E0219 09:47:17.380008 4873 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:17.879994437 +0000 UTC m=+147.169426075 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.485768 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:17 crc kubenswrapper[4873]: E0219 09:47:17.486090 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:17.986078268 +0000 UTC m=+147.275509906 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.587047 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:17 crc kubenswrapper[4873]: E0219 09:47:17.587244 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:18.087189996 +0000 UTC m=+147.376621634 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.587360 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:17 crc kubenswrapper[4873]: E0219 09:47:17.587658 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:18.087649869 +0000 UTC m=+147.377081507 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.624938 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv" event={"ID":"f69ad03d-7d61-4b31-a556-325751fcba8e","Type":"ContainerStarted","Data":"fa4571a1c2fe413379ed2f470627ac024dbe416e16701b8dcc3c2fb35ad01141"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.629812 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5" event={"ID":"6f60efd0-54f5-43eb-b824-f8eaa836df60","Type":"ContainerStarted","Data":"392ce3888e876dbb9146823d117c34dc5c175049e9bcfec4f3edbea66c9b69ed"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.631543 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mv87q" event={"ID":"176cb3ad-1201-420f-bdb2-586f974aeaf2","Type":"ContainerStarted","Data":"f74f6843c1c49ef8b307afe5321bc88322055abf9a9d42846230996638bf2c65"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.631571 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mv87q" event={"ID":"176cb3ad-1201-420f-bdb2-586f974aeaf2","Type":"ContainerStarted","Data":"7f41265a174b3dc391e4997ed1e70e57d84c83bcecf16779a1a2993039c63413"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.631915 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-mv87q" Feb 19 09:47:17 crc kubenswrapper[4873]: 
I0219 09:47:17.638954 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" event={"ID":"40382b72-88a7-4f37-9192-a555a259d4bd","Type":"ContainerStarted","Data":"ccb8fd8db4e255b70df5d9413f28212b56be1ea72e4fff568eb751bdc42dfc01"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.641635 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gbzll" event={"ID":"5968ec26-dea6-4e79-99b1-5954e173d226","Type":"ContainerStarted","Data":"2f1c16050d17f4f8ab49a62bd3cb2abaa8f050fa7baa896ed8a68aa0dd07562d"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.641661 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gbzll" event={"ID":"5968ec26-dea6-4e79-99b1-5954e173d226","Type":"ContainerStarted","Data":"654bd2e085d743a5b80f67faf8bdf89a529f19e9ca993c07d50c98f152e9aa67"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.645614 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" event={"ID":"f905b5ea-71df-4b1c-997c-d68766bcfcfe","Type":"ContainerStarted","Data":"1c029703d8f2912597cb5a128fcbab53d9cfdb22e857dd41250cc1badc58b026"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.646394 4873 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-86hhq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.646432 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" podUID="f905b5ea-71df-4b1c-997c-d68766bcfcfe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: 
connect: connection refused" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.650074 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" event={"ID":"5b283da7-d736-4ac2-a290-e142728e838a","Type":"ContainerStarted","Data":"c569c9f64f8efa230aabda049b973c4e298eb4c5e841da7ae5334645e79904a0"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.650123 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" event={"ID":"5b283da7-d736-4ac2-a290-e142728e838a","Type":"ContainerStarted","Data":"c531a0594996749d18410ed12fcc090b589c886b5c87dc00cd63928a8107091d"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.650905 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.652001 4873 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-2b5f5 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.652027 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" podUID="5b283da7-d736-4ac2-a290-e142728e838a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.652611 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lkp4m" 
event={"ID":"5288b888-1b48-4590-8d10-f3688ba87a41","Type":"ContainerStarted","Data":"03861aebdd7aa4f810bd47019d84f0fbf571701bfa0a6961b9463f32e15fe45a"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.664569 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kx6gv" podStartSLOduration=120.664556521 podStartE2EDuration="2m0.664556521s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:17.662376918 +0000 UTC m=+146.951808556" watchObservedRunningTime="2026-02-19 09:47:17.664556521 +0000 UTC m=+146.953988159" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.677043 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv" event={"ID":"21aad9a0-00de-4f42-9923-6c66c79a3a8d","Type":"ContainerStarted","Data":"baaed203c6a1d4bd0f264963f01d1e8ced2bc86f52c8d93ed1609744e131701b"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.677090 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv" event={"ID":"21aad9a0-00de-4f42-9923-6c66c79a3a8d","Type":"ContainerStarted","Data":"7f4dc70a8d22023b3b32a54af9716b31d1084b495da67830931010fc1b0bb8f4"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.677117 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv" event={"ID":"21aad9a0-00de-4f42-9923-6c66c79a3a8d","Type":"ContainerStarted","Data":"c4b052a5483678d753afdf1aa3785ac25ddbd83ec2a60fa86c3a96264b652428"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.679403 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr" event={"ID":"48911b55-fb42-412b-9298-4cba1105a164","Type":"ContainerStarted","Data":"ebfd2ce7872ae12742424aef8df0bbf397b8040075ca34c0cf06f9da9fbe9224"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.684521 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p97g8" event={"ID":"bf2de5cd-4280-4c0c-9276-b693a51986b7","Type":"ContainerStarted","Data":"bb48a06737620a3a4beca3c7b98ccea529eaf53424fb7b0b83507caf6b582baa"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.686524 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" event={"ID":"c5e97ddb-b404-4ce2-b760-2739c36c755a","Type":"ContainerStarted","Data":"ffe2d8ab5cb76d62e3535aa09dfd53d0d6b32bd77ad717ce28b04abed265c514"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.688342 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:17 crc kubenswrapper[4873]: E0219 09:47:17.689870 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:18.189828408 +0000 UTC m=+147.479260046 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.714994 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" event={"ID":"2e3dce33-cc6d-41b5-ac17-481a98c06373","Type":"ContainerStarted","Data":"95208c3f7ae6ed0e359d624f2cd6b3ec4a43078b3764df7d20384bfb957edcc1"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.722546 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt" event={"ID":"4bf2ad48-6696-4f08-adc8-330fd4c25028","Type":"ContainerStarted","Data":"b9e30abb2510dc566b3639aefe80d5895fa436bf9379a2c609ddc205a82604d5"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.722588 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt" event={"ID":"4bf2ad48-6696-4f08-adc8-330fd4c25028","Type":"ContainerStarted","Data":"d28d10ce9f03794e11a23fa90d1fcd426a1fb2fd80ce53230a0b702bec894f6b"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.723178 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.724482 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc" 
event={"ID":"12ef881d-885a-4215-bd57-27966cb209b8","Type":"ContainerStarted","Data":"6de934514d24c0fc178989594dd8ef80750809aa82f25b04f2ef7016dfbda7bd"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.725896 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2798g" event={"ID":"7a3637cc-cfef-446c-b0fb-f37f3396e0d7","Type":"ContainerStarted","Data":"33e766298c4ac8a82b60e485ee3da37b8128c024f19904586311634509db0398"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.738436 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg" event={"ID":"3c4f7134-312f-4f1d-a344-80d44d65c371","Type":"ContainerStarted","Data":"5194f241d142d3bedc938db89f1735ed15cb997da623f56b217c30c74cbd31cd"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.738973 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.753619 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5tt6k" event={"ID":"57d54c43-611a-40f1-b05e-9a0007dbe3ec","Type":"ContainerStarted","Data":"470243dcfcb35d2f3e0759c613462fd35c7e2f8c39ef376ee7b0a205fdc17b16"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.756633 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jm66x"] Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.784765 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-vklwp" podStartSLOduration=120.784722237 podStartE2EDuration="2m0.784722237s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-19 09:47:17.766450512 +0000 UTC m=+147.055882150" watchObservedRunningTime="2026-02-19 09:47:17.784722237 +0000 UTC m=+147.074153875" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.792930 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:17 crc kubenswrapper[4873]: E0219 09:47:17.793455 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:18.293426068 +0000 UTC m=+147.582857706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.808951 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.809144 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.809166 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-tjxkj" 
event={"ID":"829eb540-5f77-4748-a99d-c5bdbd13c26f","Type":"ContainerStarted","Data":"d5d942e672ae021105b11e63ce43102f38fad5f5cbaeb996e1aaa544f1b300a6"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.809591 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.815542 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jm66x"] Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.816300 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.825957 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qmrn5" event={"ID":"7671d99c-f025-4e36-b336-106655ec13ef","Type":"ContainerStarted","Data":"6c62cffe8c256d15bc1fb13f0a13cbf87c7055d0df2d95affa4598eec4d2f960"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.826005 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qmrn5" event={"ID":"7671d99c-f025-4e36-b336-106655ec13ef","Type":"ContainerStarted","Data":"dac674c05caf9ed80f36d48640d046817e30c59e1940a881e81742bf1276623b"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.830180 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" event={"ID":"a5920bdb-afd9-401e-8f11-108a90660e1c","Type":"ContainerStarted","Data":"8fb4855d36dbe8c0caf6f5c5cca86c812b9220cdc618cae25d2cd061754b51ce"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.830212 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" 
event={"ID":"a5920bdb-afd9-401e-8f11-108a90660e1c","Type":"ContainerStarted","Data":"01ef2b0c84c18ace3347b5260941e99078f586f3684f9c7da48796b5cf1cf72d"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.835013 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.862603 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d75st" event={"ID":"2877ec4c-7a3e-4105-ac87-6d096df10661","Type":"ContainerStarted","Data":"6b5be3c9aaaeafbac09b7a0ad8e8084439b6283e4f97911b7e226afb7fa36061"} Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.863493 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwg5" podStartSLOduration=120.863477193 podStartE2EDuration="2m0.863477193s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:17.862661659 +0000 UTC m=+147.152093297" watchObservedRunningTime="2026-02-19 09:47:17.863477193 +0000 UTC m=+147.152908831" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.864312 4873 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pq25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.864353 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9pq25" podUID="e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 19 09:47:17 crc 
kubenswrapper[4873]: I0219 09:47:17.883292 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7bgm9" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.901937 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dg6jw" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.902718 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:17 crc kubenswrapper[4873]: E0219 09:47:17.903710 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:18.403674459 +0000 UTC m=+147.693106097 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.914304 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" podStartSLOduration=120.914281744 podStartE2EDuration="2m0.914281744s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:17.907314373 +0000 UTC m=+147.196746011" watchObservedRunningTime="2026-02-19 09:47:17.914281744 +0000 UTC m=+147.203713382" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.925652 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.928599 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.932246 4873 patch_prober.go:28] interesting pod/apiserver-76f77b778f-gbzll container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.19:8443/livez\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.932311 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-gbzll" podUID="5968ec26-dea6-4e79-99b1-5954e173d226" 
containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.19:8443/livez\": dial tcp 10.217.0.19:8443: connect: connection refused" Feb 19 09:47:17 crc kubenswrapper[4873]: I0219 09:47:17.942275 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mv87q" podStartSLOduration=7.942259098 podStartE2EDuration="7.942259098s" podCreationTimestamp="2026-02-19 09:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:17.941894198 +0000 UTC m=+147.231325836" watchObservedRunningTime="2026-02-19 09:47:17.942259098 +0000 UTC m=+147.231690736" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.006169 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:18 crc kubenswrapper[4873]: E0219 09:47:18.008602 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:18.508588106 +0000 UTC m=+147.798019744 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.008629 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d58373-fe5d-4afe-9da1-256843164ff4-catalog-content\") pod \"community-operators-jm66x\" (UID: \"d5d58373-fe5d-4afe-9da1-256843164ff4\") " pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.008742 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d58373-fe5d-4afe-9da1-256843164ff4-utilities\") pod \"community-operators-jm66x\" (UID: \"d5d58373-fe5d-4afe-9da1-256843164ff4\") " pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.009004 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sl6f\" (UniqueName: \"kubernetes.io/projected/d5d58373-fe5d-4afe-9da1-256843164ff4-kube-api-access-7sl6f\") pod \"community-operators-jm66x\" (UID: \"d5d58373-fe5d-4afe-9da1-256843164ff4\") " pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.021778 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gbzll" podStartSLOduration=121.021762445 podStartE2EDuration="2m1.021762445s" podCreationTimestamp="2026-02-19 
09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:17.977339057 +0000 UTC m=+147.266770705" watchObservedRunningTime="2026-02-19 09:47:18.021762445 +0000 UTC m=+147.311194083" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.022963 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt" podStartSLOduration=121.022956469 podStartE2EDuration="2m1.022956469s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:18.019035297 +0000 UTC m=+147.308466935" watchObservedRunningTime="2026-02-19 09:47:18.022956469 +0000 UTC m=+147.312388107" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.056399 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5tt6k" podStartSLOduration=121.056382471 podStartE2EDuration="2m1.056382471s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:18.0549686 +0000 UTC m=+147.344400238" watchObservedRunningTime="2026-02-19 09:47:18.056382471 +0000 UTC m=+147.345814109" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.109684 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.109953 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d58373-fe5d-4afe-9da1-256843164ff4-catalog-content\") pod \"community-operators-jm66x\" (UID: \"d5d58373-fe5d-4afe-9da1-256843164ff4\") " pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.109996 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d58373-fe5d-4afe-9da1-256843164ff4-utilities\") pod \"community-operators-jm66x\" (UID: \"d5d58373-fe5d-4afe-9da1-256843164ff4\") " pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.110042 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sl6f\" (UniqueName: \"kubernetes.io/projected/d5d58373-fe5d-4afe-9da1-256843164ff4-kube-api-access-7sl6f\") pod \"community-operators-jm66x\" (UID: \"d5d58373-fe5d-4afe-9da1-256843164ff4\") " pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:47:18 crc kubenswrapper[4873]: E0219 09:47:18.110462 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:18.610447646 +0000 UTC m=+147.899879284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.110819 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d58373-fe5d-4afe-9da1-256843164ff4-catalog-content\") pod \"community-operators-jm66x\" (UID: \"d5d58373-fe5d-4afe-9da1-256843164ff4\") " pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.111024 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d58373-fe5d-4afe-9da1-256843164ff4-utilities\") pod \"community-operators-jm66x\" (UID: \"d5d58373-fe5d-4afe-9da1-256843164ff4\") " pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.142303 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8cxf7" podStartSLOduration=121.142287202 podStartE2EDuration="2m1.142287202s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:18.102853757 +0000 UTC m=+147.392285395" watchObservedRunningTime="2026-02-19 09:47:18.142287202 +0000 UTC m=+147.431718840" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.166560 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sl6f\" (UniqueName: 
\"kubernetes.io/projected/d5d58373-fe5d-4afe-9da1-256843164ff4-kube-api-access-7sl6f\") pod \"community-operators-jm66x\" (UID: \"d5d58373-fe5d-4afe-9da1-256843164ff4\") " pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.185885 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg" podStartSLOduration=121.185858875 podStartE2EDuration="2m1.185858875s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:18.177469523 +0000 UTC m=+147.466901161" watchObservedRunningTime="2026-02-19 09:47:18.185858875 +0000 UTC m=+147.475290513" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.195421 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5fj2x"] Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.196779 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.205840 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5fj2x"] Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.211473 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:18 crc kubenswrapper[4873]: E0219 09:47:18.211978 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:18.711967106 +0000 UTC m=+148.001398744 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.243686 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.243763 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.289201 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-qmrn5" podStartSLOduration=121.289171416 podStartE2EDuration="2m1.289171416s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:18.286716135 +0000 UTC m=+147.576147773" watchObservedRunningTime="2026-02-19 09:47:18.289171416 +0000 UTC m=+147.578603054" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.312646 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:18 crc kubenswrapper[4873]: E0219 09:47:18.312854 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:18.812813806 +0000 UTC m=+148.102245444 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.313088 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e52516d8-c410-4dbd-b41f-cbda11425b0e-catalog-content\") pod \"community-operators-5fj2x\" (UID: \"e52516d8-c410-4dbd-b41f-cbda11425b0e\") " pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.313174 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e52516d8-c410-4dbd-b41f-cbda11425b0e-utilities\") pod \"community-operators-5fj2x\" (UID: \"e52516d8-c410-4dbd-b41f-cbda11425b0e\") " pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.313204 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.313236 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc4qh\" (UniqueName: \"kubernetes.io/projected/e52516d8-c410-4dbd-b41f-cbda11425b0e-kube-api-access-hc4qh\") pod \"community-operators-5fj2x\" (UID: \"e52516d8-c410-4dbd-b41f-cbda11425b0e\") " pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:47:18 crc kubenswrapper[4873]: E0219 09:47:18.313508 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:18.813500756 +0000 UTC m=+148.102932394 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.327722 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hs5fr" podStartSLOduration=121.327702284 podStartE2EDuration="2m1.327702284s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:18.326830189 +0000 UTC m=+147.616261827" watchObservedRunningTime="2026-02-19 09:47:18.327702284 +0000 UTC m=+147.617133922" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.343684 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:18 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:18 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:18 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.343735 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.356840 4873 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-24gcv" podStartSLOduration=121.356824942 podStartE2EDuration="2m1.356824942s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:18.354811994 +0000 UTC m=+147.644243632" watchObservedRunningTime="2026-02-19 09:47:18.356824942 +0000 UTC m=+147.646256580" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.413700 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.414212 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e52516d8-c410-4dbd-b41f-cbda11425b0e-utilities\") pod \"community-operators-5fj2x\" (UID: \"e52516d8-c410-4dbd-b41f-cbda11425b0e\") " pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.414338 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc4qh\" (UniqueName: \"kubernetes.io/projected/e52516d8-c410-4dbd-b41f-cbda11425b0e-kube-api-access-hc4qh\") pod \"community-operators-5fj2x\" (UID: \"e52516d8-c410-4dbd-b41f-cbda11425b0e\") " pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.414504 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e52516d8-c410-4dbd-b41f-cbda11425b0e-catalog-content\") pod 
\"community-operators-5fj2x\" (UID: \"e52516d8-c410-4dbd-b41f-cbda11425b0e\") " pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:47:18 crc kubenswrapper[4873]: E0219 09:47:18.414907 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:18.914890462 +0000 UTC m=+148.204322100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.415474 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e52516d8-c410-4dbd-b41f-cbda11425b0e-catalog-content\") pod \"community-operators-5fj2x\" (UID: \"e52516d8-c410-4dbd-b41f-cbda11425b0e\") " pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.415492 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e52516d8-c410-4dbd-b41f-cbda11425b0e-utilities\") pod \"community-operators-5fj2x\" (UID: \"e52516d8-c410-4dbd-b41f-cbda11425b0e\") " pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.431649 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mwl9k" podStartSLOduration=121.431610053 
podStartE2EDuration="2m1.431610053s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:18.42768721 +0000 UTC m=+147.717118848" watchObservedRunningTime="2026-02-19 09:47:18.431610053 +0000 UTC m=+147.721041691" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.435966 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.463642 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc4qh\" (UniqueName: \"kubernetes.io/projected/e52516d8-c410-4dbd-b41f-cbda11425b0e-kube-api-access-hc4qh\") pod \"community-operators-5fj2x\" (UID: \"e52516d8-c410-4dbd-b41f-cbda11425b0e\") " pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.514606 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.515704 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:18 crc kubenswrapper[4873]: E0219 09:47:18.516194 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:19.016168734 +0000 UTC m=+148.305600372 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.575955 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-2798g" podStartSLOduration=8.575935123 podStartE2EDuration="8.575935123s" podCreationTimestamp="2026-02-19 09:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:18.477977295 +0000 UTC m=+147.767408933" watchObservedRunningTime="2026-02-19 09:47:18.575935123 +0000 UTC m=+147.865366751" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.578186 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tnf24"] Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.579229 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzspc" podStartSLOduration=121.579217817 podStartE2EDuration="2m1.579217817s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:18.578158047 +0000 UTC m=+147.867589685" watchObservedRunningTime="2026-02-19 09:47:18.579217817 +0000 UTC m=+147.868649455" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.580574 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.615420 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.617883 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:18 crc kubenswrapper[4873]: E0219 09:47:18.618210 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:19.118194748 +0000 UTC m=+148.407626386 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.642358 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tnf24"] Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.693188 4873 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rpsnj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.693248 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" podUID="c5e97ddb-b404-4ce2-b760-2739c36c755a" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.721088 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnz6w\" (UniqueName: \"kubernetes.io/projected/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-kube-api-access-vnz6w\") pod \"certified-operators-tnf24\" (UID: \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\") " pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.721326 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-utilities\") pod \"certified-operators-tnf24\" (UID: \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\") " pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.721380 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.721416 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-catalog-content\") pod \"certified-operators-tnf24\" (UID: \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\") " pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.722124 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8mch8"] Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.722995 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.733256 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8mch8"] Feb 19 09:47:18 crc kubenswrapper[4873]: E0219 09:47:18.733648 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-19 09:47:19.233633498 +0000 UTC m=+148.523065136 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.835582 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.835800 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061e8672-31d8-48ec-87fc-158e44af91e4-utilities\") pod \"certified-operators-8mch8\" (UID: \"061e8672-31d8-48ec-87fc-158e44af91e4\") " pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.835854 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnz6w\" (UniqueName: \"kubernetes.io/projected/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-kube-api-access-vnz6w\") pod \"certified-operators-tnf24\" (UID: \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\") " pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.835874 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-utilities\") pod \"certified-operators-tnf24\" (UID: \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\") " pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.835920 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061e8672-31d8-48ec-87fc-158e44af91e4-catalog-content\") pod \"certified-operators-8mch8\" (UID: \"061e8672-31d8-48ec-87fc-158e44af91e4\") " pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.835940 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq6xh\" (UniqueName: \"kubernetes.io/projected/061e8672-31d8-48ec-87fc-158e44af91e4-kube-api-access-kq6xh\") pod \"certified-operators-8mch8\" (UID: \"061e8672-31d8-48ec-87fc-158e44af91e4\") " pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.835958 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-catalog-content\") pod \"certified-operators-tnf24\" (UID: \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\") " pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.836358 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-catalog-content\") pod \"certified-operators-tnf24\" (UID: \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\") " pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.836555 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-utilities\") pod \"certified-operators-tnf24\" (UID: \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\") " pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:47:18 crc kubenswrapper[4873]: E0219 09:47:18.836929 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:19.336909939 +0000 UTC m=+148.626341577 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.866047 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnz6w\" (UniqueName: \"kubernetes.io/projected/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-kube-api-access-vnz6w\") pod \"certified-operators-tnf24\" (UID: \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\") " pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.901146 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d75st" event={"ID":"2877ec4c-7a3e-4105-ac87-6d096df10661","Type":"ContainerStarted","Data":"149b1f49fa4e1ae8aa9adfb073381e6eff152bf0f74ef70ed4423616a7ab8487"} Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.902679 4873 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pq25 container/download-server namespace/openshift-console: Readiness probe 
status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.902724 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9pq25" podUID="e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.913355 4873 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-86hhq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.913412 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" podUID="f905b5ea-71df-4b1c-997c-d68766bcfcfe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.916404 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2b5f5" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.930504 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.938753 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061e8672-31d8-48ec-87fc-158e44af91e4-utilities\") pod \"certified-operators-8mch8\" (UID: \"061e8672-31d8-48ec-87fc-158e44af91e4\") " pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.938831 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.938862 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061e8672-31d8-48ec-87fc-158e44af91e4-catalog-content\") pod \"certified-operators-8mch8\" (UID: \"061e8672-31d8-48ec-87fc-158e44af91e4\") " pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.938880 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq6xh\" (UniqueName: \"kubernetes.io/projected/061e8672-31d8-48ec-87fc-158e44af91e4-kube-api-access-kq6xh\") pod \"certified-operators-8mch8\" (UID: \"061e8672-31d8-48ec-87fc-158e44af91e4\") " pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:47:18 crc kubenswrapper[4873]: E0219 09:47:18.939375 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-19 09:47:19.439362836 +0000 UTC m=+148.728794474 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.939692 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061e8672-31d8-48ec-87fc-158e44af91e4-utilities\") pod \"certified-operators-8mch8\" (UID: \"061e8672-31d8-48ec-87fc-158e44af91e4\") " pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.948352 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061e8672-31d8-48ec-87fc-158e44af91e4-catalog-content\") pod \"certified-operators-8mch8\" (UID: \"061e8672-31d8-48ec-87fc-158e44af91e4\") " pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:47:18 crc kubenswrapper[4873]: I0219 09:47:18.989141 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq6xh\" (UniqueName: \"kubernetes.io/projected/061e8672-31d8-48ec-87fc-158e44af91e4-kube-api-access-kq6xh\") pod \"certified-operators-8mch8\" (UID: \"061e8672-31d8-48ec-87fc-158e44af91e4\") " pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.040719 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.053755 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rpsnj" Feb 19 09:47:19 crc kubenswrapper[4873]: E0219 09:47:19.054265 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:19.55424546 +0000 UTC m=+148.843677098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.096388 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.144603 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:19 crc kubenswrapper[4873]: E0219 09:47:19.145317 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:19.645302149 +0000 UTC m=+148.934733787 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.166429 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jm66x"] Feb 19 09:47:19 crc kubenswrapper[4873]: W0219 09:47:19.219608 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5d58373_fe5d_4afe_9da1_256843164ff4.slice/crio-81841dde96f0bd4c162af34dbaad80f9410851d99d1c9f71453d6997e197a658 WatchSource:0}: Error finding container 81841dde96f0bd4c162af34dbaad80f9410851d99d1c9f71453d6997e197a658: Status 404 returned error can't find the container with id 
81841dde96f0bd4c162af34dbaad80f9410851d99d1c9f71453d6997e197a658 Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.247712 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:19 crc kubenswrapper[4873]: E0219 09:47:19.248275 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:19.7482589 +0000 UTC m=+149.037690538 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.347518 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:19 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:19 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:19 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.347577 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" 
podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.352743 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.352799 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.352821 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.352862 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.352888 4873 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:47:19 crc kubenswrapper[4873]: E0219 09:47:19.356746 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:19.85673172 +0000 UTC m=+149.146163358 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.357295 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.360965 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.374684 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.376690 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.454397 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:19 crc kubenswrapper[4873]: E0219 09:47:19.454789 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:19.95477262 +0000 UTC m=+149.244204258 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.511340 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5fj2x"] Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.519349 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.531365 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.556978 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:19 crc kubenswrapper[4873]: E0219 09:47:19.557593 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:20.057580777 +0000 UTC m=+149.347012405 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.603189 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.659592 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:19 crc kubenswrapper[4873]: E0219 09:47:19.659953 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:20.159936651 +0000 UTC m=+149.449368279 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.747345 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tnf24"] Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.763562 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:19 crc kubenswrapper[4873]: E0219 09:47:19.763909 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:20.26389376 +0000 UTC m=+149.553325398 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.870872 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.871243 4873 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 19 09:47:19 crc kubenswrapper[4873]: E0219 09:47:19.871341 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:20.37132351 +0000 UTC m=+149.660755148 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.871901 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:19 crc kubenswrapper[4873]: E0219 09:47:19.872411 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:20.372378861 +0000 UTC m=+149.661810499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.913395 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnf24" event={"ID":"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3","Type":"ContainerStarted","Data":"49ebe6c3ea35eaecd163d7a7c155a22151d195a56ce773049fc5f4d9fdced9e7"} Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.940387 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d75st" event={"ID":"2877ec4c-7a3e-4105-ac87-6d096df10661","Type":"ContainerStarted","Data":"6e0b08e9eb979cfad9507bc13238bb0d9f9a803b6e9406da1617ac3758ae9062"} Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.953712 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fj2x" event={"ID":"e52516d8-c410-4dbd-b41f-cbda11425b0e","Type":"ContainerStarted","Data":"3d229a1d7483ee232f5190406e28ea1aa38e3259959252fbb620deb657e8a447"} Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.971285 4873 generic.go:334] "Generic (PLEG): container finished" podID="d5d58373-fe5d-4afe-9da1-256843164ff4" containerID="c2f5ae8579d6418292a6c0f3c975976d1e536a3be874c8a9279ae73eeb01983a" exitCode=0 Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.972748 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jm66x" 
event={"ID":"d5d58373-fe5d-4afe-9da1-256843164ff4","Type":"ContainerDied","Data":"c2f5ae8579d6418292a6c0f3c975976d1e536a3be874c8a9279ae73eeb01983a"} Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.972776 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jm66x" event={"ID":"d5d58373-fe5d-4afe-9da1-256843164ff4","Type":"ContainerStarted","Data":"81841dde96f0bd4c162af34dbaad80f9410851d99d1c9f71453d6997e197a658"} Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.974580 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.975429 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:19 crc kubenswrapper[4873]: E0219 09:47:19.975697 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:20.475683402 +0000 UTC m=+149.765115040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:19 crc kubenswrapper[4873]: I0219 09:47:19.975757 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:19 crc kubenswrapper[4873]: E0219 09:47:19.976028 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:20.476020982 +0000 UTC m=+149.765452620 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.049047 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8mch8"] Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.090496 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:20 crc kubenswrapper[4873]: E0219 09:47:20.091895 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:20.591878454 +0000 UTC m=+149.881310092 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.193963 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:20 crc kubenswrapper[4873]: E0219 09:47:20.194689 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:20.69467762 +0000 UTC m=+149.984109258 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.295810 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:20 crc kubenswrapper[4873]: E0219 09:47:20.295968 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:20.795941193 +0000 UTC m=+150.085372841 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.296430 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:20 crc kubenswrapper[4873]: E0219 09:47:20.296698 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:20.796688975 +0000 UTC m=+150.086120613 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.358394 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:20 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:20 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:20 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.358458 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.399658 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:20 crc kubenswrapper[4873]: E0219 09:47:20.399760 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 09:47:20.899743719 +0000 UTC m=+150.189175357 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.399965 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:20 crc kubenswrapper[4873]: E0219 09:47:20.400265 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:20.900258423 +0000 UTC m=+150.189690061 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.500517 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:20 crc kubenswrapper[4873]: E0219 09:47:20.500769 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:21.000755704 +0000 UTC m=+150.290187342 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.514706 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hv2j6"] Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.515651 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.520391 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.523722 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hv2j6"] Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.604440 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4gtj\" (UniqueName: \"kubernetes.io/projected/0954690a-09f0-4b1b-be57-db87e9304488-kube-api-access-f4gtj\") pod \"redhat-marketplace-hv2j6\" (UID: \"0954690a-09f0-4b1b-be57-db87e9304488\") " pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.604493 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0954690a-09f0-4b1b-be57-db87e9304488-catalog-content\") pod \"redhat-marketplace-hv2j6\" (UID: \"0954690a-09f0-4b1b-be57-db87e9304488\") " pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.604518 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0954690a-09f0-4b1b-be57-db87e9304488-utilities\") pod \"redhat-marketplace-hv2j6\" (UID: \"0954690a-09f0-4b1b-be57-db87e9304488\") " pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.604545 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:20 crc kubenswrapper[4873]: E0219 09:47:20.604777 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:21.104765735 +0000 UTC m=+150.394197373 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:20 crc kubenswrapper[4873]: W0219 09:47:20.701353 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-d84a6c10865f58fa1503155da372265852abc2c6e32f99fd581b130b7608c666 WatchSource:0}: Error finding container d84a6c10865f58fa1503155da372265852abc2c6e32f99fd581b130b7608c666: Status 404 returned error can't find the container with id d84a6c10865f58fa1503155da372265852abc2c6e32f99fd581b130b7608c666 Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.705323 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:20 crc kubenswrapper[4873]: E0219 09:47:20.705487 4873 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:21.205463462 +0000 UTC m=+150.494895100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.705596 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4gtj\" (UniqueName: \"kubernetes.io/projected/0954690a-09f0-4b1b-be57-db87e9304488-kube-api-access-f4gtj\") pod \"redhat-marketplace-hv2j6\" (UID: \"0954690a-09f0-4b1b-be57-db87e9304488\") " pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.705660 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0954690a-09f0-4b1b-be57-db87e9304488-catalog-content\") pod \"redhat-marketplace-hv2j6\" (UID: \"0954690a-09f0-4b1b-be57-db87e9304488\") " pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.705695 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0954690a-09f0-4b1b-be57-db87e9304488-utilities\") pod \"redhat-marketplace-hv2j6\" (UID: \"0954690a-09f0-4b1b-be57-db87e9304488\") " pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:47:20 crc 
kubenswrapper[4873]: I0219 09:47:20.705725 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:20 crc kubenswrapper[4873]: E0219 09:47:20.706079 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:21.206063839 +0000 UTC m=+150.495495477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.706152 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0954690a-09f0-4b1b-be57-db87e9304488-catalog-content\") pod \"redhat-marketplace-hv2j6\" (UID: \"0954690a-09f0-4b1b-be57-db87e9304488\") " pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.706285 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0954690a-09f0-4b1b-be57-db87e9304488-utilities\") pod \"redhat-marketplace-hv2j6\" (UID: \"0954690a-09f0-4b1b-be57-db87e9304488\") " 
pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.722284 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4gtj\" (UniqueName: \"kubernetes.io/projected/0954690a-09f0-4b1b-be57-db87e9304488-kube-api-access-f4gtj\") pod \"redhat-marketplace-hv2j6\" (UID: \"0954690a-09f0-4b1b-be57-db87e9304488\") " pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.806586 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:20 crc kubenswrapper[4873]: E0219 09:47:20.806731 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 09:47:21.306710464 +0000 UTC m=+150.596142102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.806955 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:20 crc kubenswrapper[4873]: E0219 09:47:20.807276 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 09:47:21.30726852 +0000 UTC m=+150.596700158 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-7hhjq" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.810335 4873 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-19T09:47:19.871863236Z","Handler":null,"Name":""} Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.829947 4873 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.830206 4873 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.908046 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2jgk6"] Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.908476 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.909598 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.917325 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jgk6"] Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.921228 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.922921 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.981431 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6214ae4158cd5843d599b33ccedfce40761d283932f955047b85b89a2124b12b"} Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.981824 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d84a6c10865f58fa1503155da372265852abc2c6e32f99fd581b130b7608c666"} Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.985270 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-d75st" event={"ID":"2877ec4c-7a3e-4105-ac87-6d096df10661","Type":"ContainerStarted","Data":"67d33499573771b5affe82793ac6ad4acb85ff82981999ff3f3e82db0366b27b"} Feb 19 09:47:20 crc kubenswrapper[4873]: 
I0219 09:47:20.986734 4873 generic.go:334] "Generic (PLEG): container finished" podID="e52516d8-c410-4dbd-b41f-cbda11425b0e" containerID="359abec79bd5f24ab125135475868fe55a53d89f41fe8e7574eec6dce9ec9eda" exitCode=0 Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.986806 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fj2x" event={"ID":"e52516d8-c410-4dbd-b41f-cbda11425b0e","Type":"ContainerDied","Data":"359abec79bd5f24ab125135475868fe55a53d89f41fe8e7574eec6dce9ec9eda"} Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.989379 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"76add2a3caef6b0ae06b59fafacc47e913c53f7e141d4a94c87f48a39f282b65"} Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.989409 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4971f0d3a202127387c09395d986632ab538b23ebe11137728b46f41aea190a7"} Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.989603 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.996334 4873 generic.go:334] "Generic (PLEG): container finished" podID="de77b9aa-b558-4431-b116-5e1e1cc116f3" containerID="e60bc2f916aff75454f8db4d5b15c6ae005baebfdcb79c0c87df06d3a9db5142" exitCode=0 Feb 19 09:47:20 crc kubenswrapper[4873]: I0219 09:47:20.996414 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw" event={"ID":"de77b9aa-b558-4431-b116-5e1e1cc116f3","Type":"ContainerDied","Data":"e60bc2f916aff75454f8db4d5b15c6ae005baebfdcb79c0c87df06d3a9db5142"} Feb 
19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:20.998950 4873 generic.go:334] "Generic (PLEG): container finished" podID="061e8672-31d8-48ec-87fc-158e44af91e4" containerID="0ca6133aff5d54bab32e361a7911c1c5856dd641828cf1fe1361309a8d03164b" exitCode=0 Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:20.999012 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mch8" event={"ID":"061e8672-31d8-48ec-87fc-158e44af91e4","Type":"ContainerDied","Data":"0ca6133aff5d54bab32e361a7911c1c5856dd641828cf1fe1361309a8d03164b"} Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:20.999039 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mch8" event={"ID":"061e8672-31d8-48ec-87fc-158e44af91e4","Type":"ContainerStarted","Data":"4f91cfd7b327a1281f11c3570024804d58c90780fc2519b06ceb03c7886c6273"} Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.010184 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm9f6\" (UniqueName: \"kubernetes.io/projected/e767e90e-5146-4f1e-9f0b-5f5acb185429-kube-api-access-vm9f6\") pod \"redhat-marketplace-2jgk6\" (UID: \"e767e90e-5146-4f1e-9f0b-5f5acb185429\") " pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.010257 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.010418 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e767e90e-5146-4f1e-9f0b-5f5acb185429-catalog-content\") pod \"redhat-marketplace-2jgk6\" (UID: \"e767e90e-5146-4f1e-9f0b-5f5acb185429\") " pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.010480 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e767e90e-5146-4f1e-9f0b-5f5acb185429-utilities\") pod \"redhat-marketplace-2jgk6\" (UID: \"e767e90e-5146-4f1e-9f0b-5f5acb185429\") " pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.012901 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b06ae8c6cdc1344d37140534f4318d41b0912691854083bec1921e704d22d1ff"} Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.012945 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c751a6335010bd4fc9b0866e320452a8a243ea2f63b63cf49f21c2ab6542e8f6"} Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.015761 4873 generic.go:334] "Generic (PLEG): container finished" podID="9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" containerID="ec81b30df0c0381b7218bf27c1e036ec8790d0ece7bfe1e31d1cbe023c4a6210" exitCode=0 Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.015895 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnf24" event={"ID":"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3","Type":"ContainerDied","Data":"ec81b30df0c0381b7218bf27c1e036ec8790d0ece7bfe1e31d1cbe023c4a6210"} Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.042191 4873 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.042232 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.064478 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-d75st" podStartSLOduration=11.064452007 podStartE2EDuration="11.064452007s" podCreationTimestamp="2026-02-19 09:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:21.062289124 +0000 UTC m=+150.351720772" watchObservedRunningTime="2026-02-19 09:47:21.064452007 +0000 UTC m=+150.353883655" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.112340 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e767e90e-5146-4f1e-9f0b-5f5acb185429-catalog-content\") pod \"redhat-marketplace-2jgk6\" (UID: \"e767e90e-5146-4f1e-9f0b-5f5acb185429\") " pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.112422 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e767e90e-5146-4f1e-9f0b-5f5acb185429-utilities\") pod \"redhat-marketplace-2jgk6\" (UID: 
\"e767e90e-5146-4f1e-9f0b-5f5acb185429\") " pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.112481 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm9f6\" (UniqueName: \"kubernetes.io/projected/e767e90e-5146-4f1e-9f0b-5f5acb185429-kube-api-access-vm9f6\") pod \"redhat-marketplace-2jgk6\" (UID: \"e767e90e-5146-4f1e-9f0b-5f5acb185429\") " pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.113508 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e767e90e-5146-4f1e-9f0b-5f5acb185429-catalog-content\") pod \"redhat-marketplace-2jgk6\" (UID: \"e767e90e-5146-4f1e-9f0b-5f5acb185429\") " pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.117825 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e767e90e-5146-4f1e-9f0b-5f5acb185429-utilities\") pod \"redhat-marketplace-2jgk6\" (UID: \"e767e90e-5146-4f1e-9f0b-5f5acb185429\") " pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.149259 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm9f6\" (UniqueName: \"kubernetes.io/projected/e767e90e-5146-4f1e-9f0b-5f5acb185429-kube-api-access-vm9f6\") pod \"redhat-marketplace-2jgk6\" (UID: \"e767e90e-5146-4f1e-9f0b-5f5acb185429\") " pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.156009 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-7hhjq\" 
(UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.192755 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hv2j6"] Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.227415 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.239834 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.313999 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gjn8l"] Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.315634 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.322277 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.326754 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjn8l"] Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.346471 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:21 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:21 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:21 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.346525 4873 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.415375 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7423538a-949c-4995-bcf8-f2b6a2f8d914-catalog-content\") pod \"redhat-operators-gjn8l\" (UID: \"7423538a-949c-4995-bcf8-f2b6a2f8d914\") " pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.415425 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gph4z\" (UniqueName: \"kubernetes.io/projected/7423538a-949c-4995-bcf8-f2b6a2f8d914-kube-api-access-gph4z\") pod \"redhat-operators-gjn8l\" (UID: \"7423538a-949c-4995-bcf8-f2b6a2f8d914\") " pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.415493 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7423538a-949c-4995-bcf8-f2b6a2f8d914-utilities\") pod \"redhat-operators-gjn8l\" (UID: \"7423538a-949c-4995-bcf8-f2b6a2f8d914\") " pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.481787 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jgk6"] Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.512666 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.514648 4873 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-dzcdv"] Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.515893 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.524456 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzcdv"] Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.524879 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7423538a-949c-4995-bcf8-f2b6a2f8d914-catalog-content\") pod \"redhat-operators-gjn8l\" (UID: \"7423538a-949c-4995-bcf8-f2b6a2f8d914\") " pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.524921 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gph4z\" (UniqueName: \"kubernetes.io/projected/7423538a-949c-4995-bcf8-f2b6a2f8d914-kube-api-access-gph4z\") pod \"redhat-operators-gjn8l\" (UID: \"7423538a-949c-4995-bcf8-f2b6a2f8d914\") " pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.525036 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7423538a-949c-4995-bcf8-f2b6a2f8d914-utilities\") pod \"redhat-operators-gjn8l\" (UID: \"7423538a-949c-4995-bcf8-f2b6a2f8d914\") " pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.525463 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7hhjq"] Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.525781 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7423538a-949c-4995-bcf8-f2b6a2f8d914-catalog-content\") pod \"redhat-operators-gjn8l\" (UID: \"7423538a-949c-4995-bcf8-f2b6a2f8d914\") " pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.525925 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7423538a-949c-4995-bcf8-f2b6a2f8d914-utilities\") pod \"redhat-operators-gjn8l\" (UID: \"7423538a-949c-4995-bcf8-f2b6a2f8d914\") " pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:47:21 crc kubenswrapper[4873]: W0219 09:47:21.526038 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode767e90e_5146_4f1e_9f0b_5f5acb185429.slice/crio-2314d422015de89852637bb24195891fe7f3f4631802a8cc5426a4f84f3df229 WatchSource:0}: Error finding container 2314d422015de89852637bb24195891fe7f3f4631802a8cc5426a4f84f3df229: Status 404 returned error can't find the container with id 2314d422015de89852637bb24195891fe7f3f4631802a8cc5426a4f84f3df229 Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.556393 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gph4z\" (UniqueName: \"kubernetes.io/projected/7423538a-949c-4995-bcf8-f2b6a2f8d914-kube-api-access-gph4z\") pod \"redhat-operators-gjn8l\" (UID: \"7423538a-949c-4995-bcf8-f2b6a2f8d914\") " pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:47:21 crc kubenswrapper[4873]: W0219 09:47:21.563349 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2948a5a7_4d94_4314_acdf_489dd93609b9.slice/crio-9186481593e0db9c07ae375e1f7f148954394edd55d25a1feea71b13835f9c08 WatchSource:0}: Error finding container 9186481593e0db9c07ae375e1f7f148954394edd55d25a1feea71b13835f9c08: Status 404 returned error can't find the container with id 
9186481593e0db9c07ae375e1f7f148954394edd55d25a1feea71b13835f9c08 Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.625806 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d152d3c6-e3c6-4255-95b5-eafe02557eb9-catalog-content\") pod \"redhat-operators-dzcdv\" (UID: \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\") " pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.625861 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d152d3c6-e3c6-4255-95b5-eafe02557eb9-utilities\") pod \"redhat-operators-dzcdv\" (UID: \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\") " pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.626071 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zms24\" (UniqueName: \"kubernetes.io/projected/d152d3c6-e3c6-4255-95b5-eafe02557eb9-kube-api-access-zms24\") pod \"redhat-operators-dzcdv\" (UID: \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\") " pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.669710 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.728629 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d152d3c6-e3c6-4255-95b5-eafe02557eb9-catalog-content\") pod \"redhat-operators-dzcdv\" (UID: \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\") " pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.728683 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d152d3c6-e3c6-4255-95b5-eafe02557eb9-utilities\") pod \"redhat-operators-dzcdv\" (UID: \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\") " pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.728731 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zms24\" (UniqueName: \"kubernetes.io/projected/d152d3c6-e3c6-4255-95b5-eafe02557eb9-kube-api-access-zms24\") pod \"redhat-operators-dzcdv\" (UID: \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\") " pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.729304 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d152d3c6-e3c6-4255-95b5-eafe02557eb9-catalog-content\") pod \"redhat-operators-dzcdv\" (UID: \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\") " pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.729397 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d152d3c6-e3c6-4255-95b5-eafe02557eb9-utilities\") pod \"redhat-operators-dzcdv\" (UID: \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\") " 
pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.766077 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zms24\" (UniqueName: \"kubernetes.io/projected/d152d3c6-e3c6-4255-95b5-eafe02557eb9-kube-api-access-zms24\") pod \"redhat-operators-dzcdv\" (UID: \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\") " pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:47:21 crc kubenswrapper[4873]: I0219 09:47:21.868408 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.030977 4873 generic.go:334] "Generic (PLEG): container finished" podID="e767e90e-5146-4f1e-9f0b-5f5acb185429" containerID="3d19858345e60805d3d74034c65385dcd27dffb04874a42aed209e62e4d2c8a6" exitCode=0 Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.031168 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jgk6" event={"ID":"e767e90e-5146-4f1e-9f0b-5f5acb185429","Type":"ContainerDied","Data":"3d19858345e60805d3d74034c65385dcd27dffb04874a42aed209e62e4d2c8a6"} Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.031681 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jgk6" event={"ID":"e767e90e-5146-4f1e-9f0b-5f5acb185429","Type":"ContainerStarted","Data":"2314d422015de89852637bb24195891fe7f3f4631802a8cc5426a4f84f3df229"} Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.035047 4873 generic.go:334] "Generic (PLEG): container finished" podID="0954690a-09f0-4b1b-be57-db87e9304488" containerID="8612f8546567d73e69164de7ce77990ef80b0c0484ccccf94d45b84576d5ac5b" exitCode=0 Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.035227 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hv2j6" 
event={"ID":"0954690a-09f0-4b1b-be57-db87e9304488","Type":"ContainerDied","Data":"8612f8546567d73e69164de7ce77990ef80b0c0484ccccf94d45b84576d5ac5b"} Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.035271 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hv2j6" event={"ID":"0954690a-09f0-4b1b-be57-db87e9304488","Type":"ContainerStarted","Data":"edab2539f7fc8755b323d06c9cc87b6333d411f7bbacd04da485c28f244826a3"} Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.044983 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" event={"ID":"2948a5a7-4d94-4314-acdf-489dd93609b9","Type":"ContainerStarted","Data":"1996e733635906962b7b2e8b3762e89eeadec10f1b534e11c0d90dd0767471bc"} Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.045024 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.045038 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" event={"ID":"2948a5a7-4d94-4314-acdf-489dd93609b9","Type":"ContainerStarted","Data":"9186481593e0db9c07ae375e1f7f148954394edd55d25a1feea71b13835f9c08"} Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.103988 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" podStartSLOduration=125.103960894 podStartE2EDuration="2m5.103960894s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:22.103119709 +0000 UTC m=+151.392551347" watchObservedRunningTime="2026-02-19 09:47:22.103960894 +0000 UTC m=+151.393392532" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.168628 4873 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzcdv"] Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.261341 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gk5mg" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.346910 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:22 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:22 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:22 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.347206 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.422531 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.482153 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjn8l"] Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.551177 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj2jk\" (UniqueName: \"kubernetes.io/projected/de77b9aa-b558-4431-b116-5e1e1cc116f3-kube-api-access-nj2jk\") pod \"de77b9aa-b558-4431-b116-5e1e1cc116f3\" (UID: \"de77b9aa-b558-4431-b116-5e1e1cc116f3\") " Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.551690 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de77b9aa-b558-4431-b116-5e1e1cc116f3-secret-volume\") pod \"de77b9aa-b558-4431-b116-5e1e1cc116f3\" (UID: \"de77b9aa-b558-4431-b116-5e1e1cc116f3\") " Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.551787 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de77b9aa-b558-4431-b116-5e1e1cc116f3-config-volume\") pod \"de77b9aa-b558-4431-b116-5e1e1cc116f3\" (UID: \"de77b9aa-b558-4431-b116-5e1e1cc116f3\") " Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.552499 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de77b9aa-b558-4431-b116-5e1e1cc116f3-config-volume" (OuterVolumeSpecName: "config-volume") pod "de77b9aa-b558-4431-b116-5e1e1cc116f3" (UID: "de77b9aa-b558-4431-b116-5e1e1cc116f3"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.552634 4873 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de77b9aa-b558-4431-b116-5e1e1cc116f3-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.561534 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de77b9aa-b558-4431-b116-5e1e1cc116f3-kube-api-access-nj2jk" (OuterVolumeSpecName: "kube-api-access-nj2jk") pod "de77b9aa-b558-4431-b116-5e1e1cc116f3" (UID: "de77b9aa-b558-4431-b116-5e1e1cc116f3"). InnerVolumeSpecName "kube-api-access-nj2jk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.562255 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de77b9aa-b558-4431-b116-5e1e1cc116f3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "de77b9aa-b558-4431-b116-5e1e1cc116f3" (UID: "de77b9aa-b558-4431-b116-5e1e1cc116f3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.619498 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 09:47:22 crc kubenswrapper[4873]: E0219 09:47:22.619815 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de77b9aa-b558-4431-b116-5e1e1cc116f3" containerName="collect-profiles" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.619835 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="de77b9aa-b558-4431-b116-5e1e1cc116f3" containerName="collect-profiles" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.619941 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="de77b9aa-b558-4431-b116-5e1e1cc116f3" containerName="collect-profiles" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.620469 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.625446 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.625491 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.625608 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.653590 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.653639 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.653704 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj2jk\" (UniqueName: \"kubernetes.io/projected/de77b9aa-b558-4431-b116-5e1e1cc116f3-kube-api-access-nj2jk\") on node \"crc\" DevicePath \"\"" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.653716 4873 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de77b9aa-b558-4431-b116-5e1e1cc116f3-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.723371 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.728779 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.733875 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.734299 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.740860 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.754126 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a74b91a5-c78a-4bd3-92d5-1fd2b7237aca-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a74b91a5-c78a-4bd3-92d5-1fd2b7237aca\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.754168 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a74b91a5-c78a-4bd3-92d5-1fd2b7237aca-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a74b91a5-c78a-4bd3-92d5-1fd2b7237aca\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.754215 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.754253 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.754371 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.781483 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.855380 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a74b91a5-c78a-4bd3-92d5-1fd2b7237aca-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a74b91a5-c78a-4bd3-92d5-1fd2b7237aca\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.855464 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a74b91a5-c78a-4bd3-92d5-1fd2b7237aca-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a74b91a5-c78a-4bd3-92d5-1fd2b7237aca\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.855530 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a74b91a5-c78a-4bd3-92d5-1fd2b7237aca-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a74b91a5-c78a-4bd3-92d5-1fd2b7237aca\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.892848 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a74b91a5-c78a-4bd3-92d5-1fd2b7237aca-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a74b91a5-c78a-4bd3-92d5-1fd2b7237aca\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.931889 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.940251 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gbzll" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.944307 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.944357 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.946651 4873 patch_prober.go:28] interesting pod/console-f9d7485db-shnwj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.946729 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-shnwj" podUID="10aa25f4-7549-468a-b42f-19305ad066dd" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 
10.217.0.12:8443: connect: connection refused" Feb 19 09:47:22 crc kubenswrapper[4873]: I0219 09:47:22.960182 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.050973 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.067428 4873 generic.go:334] "Generic (PLEG): container finished" podID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" containerID="75a546ab60f91886bf73906724d9833647cf46b858664cd39c852a73088064e8" exitCode=0 Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.067515 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzcdv" event={"ID":"d152d3c6-e3c6-4255-95b5-eafe02557eb9","Type":"ContainerDied","Data":"75a546ab60f91886bf73906724d9833647cf46b858664cd39c852a73088064e8"} Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.067543 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzcdv" event={"ID":"d152d3c6-e3c6-4255-95b5-eafe02557eb9","Type":"ContainerStarted","Data":"81340de24ca383dbb41a0340acf197019d868e5563832a3950dc50b33c15f087"} Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.133401 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw" event={"ID":"de77b9aa-b558-4431-b116-5e1e1cc116f3","Type":"ContainerDied","Data":"a018522e013b75a19d6f1ebe089ac24d73537d03912cdb27eb2e286e6cfe33f1"} Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.133437 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a018522e013b75a19d6f1ebe089ac24d73537d03912cdb27eb2e286e6cfe33f1" Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.133474 4873 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw" Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.147694 4873 generic.go:334] "Generic (PLEG): container finished" podID="7423538a-949c-4995-bcf8-f2b6a2f8d914" containerID="9629d2d8b3e9b42462cd22dea41c2d0741500af9092d6a0c14427b12d6440d6e" exitCode=0 Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.147996 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjn8l" event={"ID":"7423538a-949c-4995-bcf8-f2b6a2f8d914","Type":"ContainerDied","Data":"9629d2d8b3e9b42462cd22dea41c2d0741500af9092d6a0c14427b12d6440d6e"} Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.148116 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjn8l" event={"ID":"7423538a-949c-4995-bcf8-f2b6a2f8d914","Type":"ContainerStarted","Data":"5bca84b1a6668c5e7d3c16b7d1810bc8d1542096d34580cd77564b1a69e0e7cc"} Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.294929 4873 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pq25 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.295237 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-9pq25" podUID="e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.295681 4873 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pq25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: 
connect: connection refused" start-of-body= Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.295698 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9pq25" podUID="e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.339055 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.341979 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:23 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:23 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:23 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.342029 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.591839 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.732621 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" Feb 19 09:47:23 crc kubenswrapper[4873]: I0219 09:47:23.770814 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 
09:47:23 crc kubenswrapper[4873]: W0219 09:47:23.809465 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda74b91a5_c78a_4bd3_92d5_1fd2b7237aca.slice/crio-a97cf67854674d49088004bb367b4075c36af7aaa720457ef826861e4ac9c3e1 WatchSource:0}: Error finding container a97cf67854674d49088004bb367b4075c36af7aaa720457ef826861e4ac9c3e1: Status 404 returned error can't find the container with id a97cf67854674d49088004bb367b4075c36af7aaa720457ef826861e4ac9c3e1 Feb 19 09:47:24 crc kubenswrapper[4873]: I0219 09:47:24.173583 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee","Type":"ContainerStarted","Data":"1c55529d8e541c248142ab8026e5850fd051ee5e23de0e342f36d4de091fd688"} Feb 19 09:47:24 crc kubenswrapper[4873]: I0219 09:47:24.179690 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a74b91a5-c78a-4bd3-92d5-1fd2b7237aca","Type":"ContainerStarted","Data":"a97cf67854674d49088004bb367b4075c36af7aaa720457ef826861e4ac9c3e1"} Feb 19 09:47:24 crc kubenswrapper[4873]: I0219 09:47:24.341847 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:24 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:24 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:24 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:24 crc kubenswrapper[4873]: I0219 09:47:24.341907 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:25 
crc kubenswrapper[4873]: I0219 09:47:25.192280 4873 generic.go:334] "Generic (PLEG): container finished" podID="d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee" containerID="312932e6b0041321c21e003869fd651d5204223b0419fc47ab0b3813ba249bdc" exitCode=0 Feb 19 09:47:25 crc kubenswrapper[4873]: I0219 09:47:25.192527 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee","Type":"ContainerDied","Data":"312932e6b0041321c21e003869fd651d5204223b0419fc47ab0b3813ba249bdc"} Feb 19 09:47:25 crc kubenswrapper[4873]: I0219 09:47:25.217312 4873 generic.go:334] "Generic (PLEG): container finished" podID="a74b91a5-c78a-4bd3-92d5-1fd2b7237aca" containerID="0979056b50e27f65b536fb51d74762974f9a878992d013ad8c3a0ca1ed6ca214" exitCode=0 Feb 19 09:47:25 crc kubenswrapper[4873]: I0219 09:47:25.217361 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a74b91a5-c78a-4bd3-92d5-1fd2b7237aca","Type":"ContainerDied","Data":"0979056b50e27f65b536fb51d74762974f9a878992d013ad8c3a0ca1ed6ca214"} Feb 19 09:47:25 crc kubenswrapper[4873]: I0219 09:47:25.342433 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:25 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:25 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:25 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:25 crc kubenswrapper[4873]: I0219 09:47:25.342501 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:26 crc 
kubenswrapper[4873]: I0219 09:47:26.124943 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:47:26 crc kubenswrapper[4873]: I0219 09:47:26.342547 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:26 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:26 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:26 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:26 crc kubenswrapper[4873]: I0219 09:47:26.342599 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:26 crc kubenswrapper[4873]: I0219 09:47:26.779161 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 09:47:26 crc kubenswrapper[4873]: I0219 09:47:26.795288 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 09:47:26 crc kubenswrapper[4873]: I0219 09:47:26.955896 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a74b91a5-c78a-4bd3-92d5-1fd2b7237aca-kube-api-access\") pod \"a74b91a5-c78a-4bd3-92d5-1fd2b7237aca\" (UID: \"a74b91a5-c78a-4bd3-92d5-1fd2b7237aca\") " Feb 19 09:47:26 crc kubenswrapper[4873]: I0219 09:47:26.955988 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a74b91a5-c78a-4bd3-92d5-1fd2b7237aca-kubelet-dir\") pod \"a74b91a5-c78a-4bd3-92d5-1fd2b7237aca\" (UID: \"a74b91a5-c78a-4bd3-92d5-1fd2b7237aca\") " Feb 19 09:47:26 crc kubenswrapper[4873]: I0219 09:47:26.956011 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee-kubelet-dir\") pod \"d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee\" (UID: \"d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee\") " Feb 19 09:47:26 crc kubenswrapper[4873]: I0219 09:47:26.956047 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee-kube-api-access\") pod \"d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee\" (UID: \"d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee\") " Feb 19 09:47:26 crc kubenswrapper[4873]: I0219 09:47:26.960771 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a74b91a5-c78a-4bd3-92d5-1fd2b7237aca-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a74b91a5-c78a-4bd3-92d5-1fd2b7237aca" (UID: "a74b91a5-c78a-4bd3-92d5-1fd2b7237aca"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:47:26 crc kubenswrapper[4873]: I0219 09:47:26.960862 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee" (UID: "d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:47:26 crc kubenswrapper[4873]: I0219 09:47:26.982729 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee" (UID: "d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:47:26 crc kubenswrapper[4873]: I0219 09:47:26.982793 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a74b91a5-c78a-4bd3-92d5-1fd2b7237aca-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a74b91a5-c78a-4bd3-92d5-1fd2b7237aca" (UID: "a74b91a5-c78a-4bd3-92d5-1fd2b7237aca"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:47:27 crc kubenswrapper[4873]: I0219 09:47:27.058276 4873 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a74b91a5-c78a-4bd3-92d5-1fd2b7237aca-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 09:47:27 crc kubenswrapper[4873]: I0219 09:47:27.058312 4873 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 09:47:27 crc kubenswrapper[4873]: I0219 09:47:27.058320 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 09:47:27 crc kubenswrapper[4873]: I0219 09:47:27.058330 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a74b91a5-c78a-4bd3-92d5-1fd2b7237aca-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 09:47:27 crc kubenswrapper[4873]: I0219 09:47:27.241688 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a74b91a5-c78a-4bd3-92d5-1fd2b7237aca","Type":"ContainerDied","Data":"a97cf67854674d49088004bb367b4075c36af7aaa720457ef826861e4ac9c3e1"} Feb 19 09:47:27 crc kubenswrapper[4873]: I0219 09:47:27.241726 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a97cf67854674d49088004bb367b4075c36af7aaa720457ef826861e4ac9c3e1" Feb 19 09:47:27 crc kubenswrapper[4873]: I0219 09:47:27.241773 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 09:47:27 crc kubenswrapper[4873]: I0219 09:47:27.247948 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee","Type":"ContainerDied","Data":"1c55529d8e541c248142ab8026e5850fd051ee5e23de0e342f36d4de091fd688"} Feb 19 09:47:27 crc kubenswrapper[4873]: I0219 09:47:27.247987 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c55529d8e541c248142ab8026e5850fd051ee5e23de0e342f36d4de091fd688" Feb 19 09:47:27 crc kubenswrapper[4873]: I0219 09:47:27.250342 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 09:47:27 crc kubenswrapper[4873]: I0219 09:47:27.343641 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:27 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:27 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:27 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:27 crc kubenswrapper[4873]: I0219 09:47:27.343703 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:28 crc kubenswrapper[4873]: I0219 09:47:28.341873 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:28 crc 
kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:28 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:28 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:28 crc kubenswrapper[4873]: I0219 09:47:28.342398 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:28 crc kubenswrapper[4873]: I0219 09:47:28.806625 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mv87q" Feb 19 09:47:29 crc kubenswrapper[4873]: I0219 09:47:29.341072 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:29 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:29 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:29 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:29 crc kubenswrapper[4873]: I0219 09:47:29.341159 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:30 crc kubenswrapper[4873]: I0219 09:47:30.342928 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:30 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:30 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:30 crc kubenswrapper[4873]: 
healthz check failed Feb 19 09:47:30 crc kubenswrapper[4873]: I0219 09:47:30.343280 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:31 crc kubenswrapper[4873]: I0219 09:47:31.341348 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:31 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:31 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:31 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:31 crc kubenswrapper[4873]: I0219 09:47:31.341395 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:32 crc kubenswrapper[4873]: I0219 09:47:32.343985 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:32 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:32 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:32 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:32 crc kubenswrapper[4873]: I0219 09:47:32.344253 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 
19 09:47:32 crc kubenswrapper[4873]: I0219 09:47:32.944369 4873 patch_prober.go:28] interesting pod/console-f9d7485db-shnwj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 19 09:47:32 crc kubenswrapper[4873]: I0219 09:47:32.944456 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-shnwj" podUID="10aa25f4-7549-468a-b42f-19305ad066dd" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 19 09:47:33 crc kubenswrapper[4873]: I0219 09:47:33.294552 4873 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pq25 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 19 09:47:33 crc kubenswrapper[4873]: I0219 09:47:33.294612 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9pq25" podUID="e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 19 09:47:33 crc kubenswrapper[4873]: I0219 09:47:33.294847 4873 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pq25 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 19 09:47:33 crc kubenswrapper[4873]: I0219 09:47:33.294930 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-9pq25" podUID="e0f6a9a1-70e5-46ce-97aa-3dc9d26c672e" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 19 09:47:33 crc kubenswrapper[4873]: I0219 09:47:33.341704 4873 patch_prober.go:28] interesting pod/router-default-5444994796-kzpbf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 09:47:33 crc kubenswrapper[4873]: [-]has-synced failed: reason withheld Feb 19 09:47:33 crc kubenswrapper[4873]: [+]process-running ok Feb 19 09:47:33 crc kubenswrapper[4873]: healthz check failed Feb 19 09:47:33 crc kubenswrapper[4873]: I0219 09:47:33.341777 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kzpbf" podUID="34f3caca-1b4c-493d-a10b-277b42d7ce72" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 09:47:34 crc kubenswrapper[4873]: I0219 09:47:34.342831 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:34 crc kubenswrapper[4873]: I0219 09:47:34.346832 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-kzpbf" Feb 19 09:47:39 crc kubenswrapper[4873]: I0219 09:47:39.896648 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs\") pod \"network-metrics-daemon-lcp8k\" (UID: \"98d35597-056d-48f0-b599-28b098dd45f3\") " pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:47:39 crc kubenswrapper[4873]: I0219 09:47:39.919633 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98d35597-056d-48f0-b599-28b098dd45f3-metrics-certs\") pod \"network-metrics-daemon-lcp8k\" (UID: 
\"98d35597-056d-48f0-b599-28b098dd45f3\") " pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:47:39 crc kubenswrapper[4873]: I0219 09:47:39.944256 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lcp8k" Feb 19 09:47:41 crc kubenswrapper[4873]: I0219 09:47:41.246883 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:47:42 crc kubenswrapper[4873]: I0219 09:47:42.950635 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:42 crc kubenswrapper[4873]: I0219 09:47:42.956166 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:47:43 crc kubenswrapper[4873]: I0219 09:47:43.299057 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-9pq25" Feb 19 09:47:48 crc kubenswrapper[4873]: I0219 09:47:48.240585 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:47:48 crc kubenswrapper[4873]: I0219 09:47:48.241179 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:47:51 crc kubenswrapper[4873]: I0219 09:47:51.591867 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4g545"] Feb 19 09:47:52 crc 
kubenswrapper[4873]: E0219 09:47:52.523151 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 19 09:47:52 crc kubenswrapper[4873]: E0219 09:47:52.523366 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hc4qh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-5fj2x_openshift-marketplace(e52516d8-c410-4dbd-b41f-cbda11425b0e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 09:47:52 crc kubenswrapper[4873]: E0219 09:47:52.524692 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5fj2x" podUID="e52516d8-c410-4dbd-b41f-cbda11425b0e" Feb 19 09:47:52 crc kubenswrapper[4873]: E0219 09:47:52.621423 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 19 09:47:52 crc kubenswrapper[4873]: E0219 09:47:52.622026 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vm9f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2jgk6_openshift-marketplace(e767e90e-5146-4f1e-9f0b-5f5acb185429): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 09:47:52 crc kubenswrapper[4873]: E0219 09:47:52.623194 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2jgk6" podUID="e767e90e-5146-4f1e-9f0b-5f5acb185429" Feb 19 09:47:52 crc 
kubenswrapper[4873]: E0219 09:47:52.680769 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 19 09:47:52 crc kubenswrapper[4873]: E0219 09:47:52.680916 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vnz6w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-tnf24_openshift-marketplace(9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 09:47:52 crc kubenswrapper[4873]: E0219 09:47:52.682090 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-tnf24" podUID="9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" Feb 19 09:47:52 crc kubenswrapper[4873]: E0219 09:47:52.689904 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 19 09:47:52 crc kubenswrapper[4873]: E0219 09:47:52.690138 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7sl6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jm66x_openshift-marketplace(d5d58373-fe5d-4afe-9da1-256843164ff4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 09:47:52 crc kubenswrapper[4873]: E0219 09:47:52.691327 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jm66x" podUID="d5d58373-fe5d-4afe-9da1-256843164ff4" Feb 19 09:47:52 crc 
kubenswrapper[4873]: E0219 09:47:52.807613 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 19 09:47:52 crc kubenswrapper[4873]: E0219 09:47:52.807770 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zms24,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-dzcdv_openshift-marketplace(d152d3c6-e3c6-4255-95b5-eafe02557eb9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 09:47:52 crc kubenswrapper[4873]: E0219 09:47:52.810416 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-dzcdv" podUID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" Feb 19 09:47:52 crc kubenswrapper[4873]: I0219 09:47:52.973333 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lcp8k"] Feb 19 09:47:52 crc kubenswrapper[4873]: W0219 09:47:52.978161 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98d35597_056d_48f0_b599_28b098dd45f3.slice/crio-2a9f95e854e8de082c3f0b3167bc0a8f9ba7069fecd049749001c044fa20cb3d WatchSource:0}: Error finding container 2a9f95e854e8de082c3f0b3167bc0a8f9ba7069fecd049749001c044fa20cb3d: Status 404 returned error can't find the container with id 2a9f95e854e8de082c3f0b3167bc0a8f9ba7069fecd049749001c044fa20cb3d Feb 19 09:47:52 crc kubenswrapper[4873]: I0219 09:47:52.992731 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" event={"ID":"98d35597-056d-48f0-b599-28b098dd45f3","Type":"ContainerStarted","Data":"2a9f95e854e8de082c3f0b3167bc0a8f9ba7069fecd049749001c044fa20cb3d"} Feb 19 09:47:52 crc kubenswrapper[4873]: I0219 09:47:52.994730 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mch8" event={"ID":"061e8672-31d8-48ec-87fc-158e44af91e4","Type":"ContainerStarted","Data":"dde867d0270d4b3c266ef7d78245dcb4ac5bc44881658ebda50b8400fe4b5ef0"} Feb 19 
09:47:52 crc kubenswrapper[4873]: I0219 09:47:52.998500 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hv2j6" event={"ID":"0954690a-09f0-4b1b-be57-db87e9304488","Type":"ContainerStarted","Data":"6e4a6ad4fb008d881b5209c2ea81c21452449e9b024f3846af5079a4f4b5e1a9"} Feb 19 09:47:53 crc kubenswrapper[4873]: I0219 09:47:53.000593 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjn8l" event={"ID":"7423538a-949c-4995-bcf8-f2b6a2f8d914","Type":"ContainerStarted","Data":"e509c49bf20ffebae8039816e8a5dbc8ef6b58ea2d9ef21e6d469e9a99d8fcf2"} Feb 19 09:47:53 crc kubenswrapper[4873]: E0219 09:47:53.001905 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5fj2x" podUID="e52516d8-c410-4dbd-b41f-cbda11425b0e" Feb 19 09:47:53 crc kubenswrapper[4873]: E0219 09:47:53.004076 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jm66x" podUID="d5d58373-fe5d-4afe-9da1-256843164ff4" Feb 19 09:47:53 crc kubenswrapper[4873]: E0219 09:47:53.004123 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2jgk6" podUID="e767e90e-5146-4f1e-9f0b-5f5acb185429" Feb 19 09:47:53 crc kubenswrapper[4873]: E0219 09:47:53.004159 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-tnf24" podUID="9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" Feb 19 09:47:53 crc kubenswrapper[4873]: E0219 09:47:53.004603 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-dzcdv" podUID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" Feb 19 09:47:53 crc kubenswrapper[4873]: I0219 09:47:53.722924 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jjkrt" Feb 19 09:47:54 crc kubenswrapper[4873]: I0219 09:47:54.007359 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" event={"ID":"98d35597-056d-48f0-b599-28b098dd45f3","Type":"ContainerStarted","Data":"eff0e3f7ead72779407a2eece5581ea236489792ab0ad02d23a186abcf664e2c"} Feb 19 09:47:54 crc kubenswrapper[4873]: I0219 09:47:54.008078 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lcp8k" event={"ID":"98d35597-056d-48f0-b599-28b098dd45f3","Type":"ContainerStarted","Data":"c66ebb5bcebeb69e28284dccb73ad3ce6d73f6d7358e10e1e7a0c630ec2629d4"} Feb 19 09:47:54 crc kubenswrapper[4873]: I0219 09:47:54.009166 4873 generic.go:334] "Generic (PLEG): container finished" podID="061e8672-31d8-48ec-87fc-158e44af91e4" containerID="dde867d0270d4b3c266ef7d78245dcb4ac5bc44881658ebda50b8400fe4b5ef0" exitCode=0 Feb 19 09:47:54 crc kubenswrapper[4873]: I0219 09:47:54.009230 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mch8" 
event={"ID":"061e8672-31d8-48ec-87fc-158e44af91e4","Type":"ContainerDied","Data":"dde867d0270d4b3c266ef7d78245dcb4ac5bc44881658ebda50b8400fe4b5ef0"} Feb 19 09:47:54 crc kubenswrapper[4873]: I0219 09:47:54.011112 4873 generic.go:334] "Generic (PLEG): container finished" podID="0954690a-09f0-4b1b-be57-db87e9304488" containerID="6e4a6ad4fb008d881b5209c2ea81c21452449e9b024f3846af5079a4f4b5e1a9" exitCode=0 Feb 19 09:47:54 crc kubenswrapper[4873]: I0219 09:47:54.011166 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hv2j6" event={"ID":"0954690a-09f0-4b1b-be57-db87e9304488","Type":"ContainerDied","Data":"6e4a6ad4fb008d881b5209c2ea81c21452449e9b024f3846af5079a4f4b5e1a9"} Feb 19 09:47:54 crc kubenswrapper[4873]: I0219 09:47:54.013713 4873 generic.go:334] "Generic (PLEG): container finished" podID="7423538a-949c-4995-bcf8-f2b6a2f8d914" containerID="e509c49bf20ffebae8039816e8a5dbc8ef6b58ea2d9ef21e6d469e9a99d8fcf2" exitCode=0 Feb 19 09:47:54 crc kubenswrapper[4873]: I0219 09:47:54.013741 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjn8l" event={"ID":"7423538a-949c-4995-bcf8-f2b6a2f8d914","Type":"ContainerDied","Data":"e509c49bf20ffebae8039816e8a5dbc8ef6b58ea2d9ef21e6d469e9a99d8fcf2"} Feb 19 09:47:54 crc kubenswrapper[4873]: I0219 09:47:54.045012 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lcp8k" podStartSLOduration=157.0449937 podStartE2EDuration="2m37.0449937s" podCreationTimestamp="2026-02-19 09:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:47:54.028034682 +0000 UTC m=+183.317466320" watchObservedRunningTime="2026-02-19 09:47:54.0449937 +0000 UTC m=+183.334425338" Feb 19 09:47:55 crc kubenswrapper[4873]: I0219 09:47:55.021262 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-8mch8" event={"ID":"061e8672-31d8-48ec-87fc-158e44af91e4","Type":"ContainerStarted","Data":"2dca850c88c97df8745ed0cd2c09857330c9ddf17d65678c2f357f2f1bc1105f"} Feb 19 09:47:55 crc kubenswrapper[4873]: I0219 09:47:55.023418 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hv2j6" event={"ID":"0954690a-09f0-4b1b-be57-db87e9304488","Type":"ContainerStarted","Data":"74fb37ff12f72c03a5f94367ab0f45dbbc7b48f6d463c3dd733000d61c96b4ba"} Feb 19 09:47:55 crc kubenswrapper[4873]: I0219 09:47:55.032969 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjn8l" event={"ID":"7423538a-949c-4995-bcf8-f2b6a2f8d914","Type":"ContainerStarted","Data":"f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0"} Feb 19 09:47:55 crc kubenswrapper[4873]: I0219 09:47:55.077180 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8mch8" podStartSLOduration=3.5553230940000002 podStartE2EDuration="37.077145985s" podCreationTimestamp="2026-02-19 09:47:18 +0000 UTC" firstStartedPulling="2026-02-19 09:47:21.003528114 +0000 UTC m=+150.292959752" lastFinishedPulling="2026-02-19 09:47:54.525350995 +0000 UTC m=+183.814782643" observedRunningTime="2026-02-19 09:47:55.072946574 +0000 UTC m=+184.362378212" watchObservedRunningTime="2026-02-19 09:47:55.077145985 +0000 UTC m=+184.366577623" Feb 19 09:47:55 crc kubenswrapper[4873]: I0219 09:47:55.112891 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hv2j6" podStartSLOduration=2.647876588 podStartE2EDuration="35.112873953s" podCreationTimestamp="2026-02-19 09:47:20 +0000 UTC" firstStartedPulling="2026-02-19 09:47:22.040983553 +0000 UTC m=+151.330415191" lastFinishedPulling="2026-02-19 09:47:54.505980918 +0000 UTC m=+183.795412556" observedRunningTime="2026-02-19 
09:47:55.111676118 +0000 UTC m=+184.401107756" watchObservedRunningTime="2026-02-19 09:47:55.112873953 +0000 UTC m=+184.402305591" Feb 19 09:47:55 crc kubenswrapper[4873]: I0219 09:47:55.149419 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gjn8l" podStartSLOduration=2.865984052 podStartE2EDuration="34.149397653s" podCreationTimestamp="2026-02-19 09:47:21 +0000 UTC" firstStartedPulling="2026-02-19 09:47:23.156715332 +0000 UTC m=+152.446146970" lastFinishedPulling="2026-02-19 09:47:54.440128933 +0000 UTC m=+183.729560571" observedRunningTime="2026-02-19 09:47:55.147794837 +0000 UTC m=+184.437226475" watchObservedRunningTime="2026-02-19 09:47:55.149397653 +0000 UTC m=+184.438829281" Feb 19 09:47:59 crc kubenswrapper[4873]: I0219 09:47:59.096898 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:47:59 crc kubenswrapper[4873]: I0219 09:47:59.098076 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:47:59 crc kubenswrapper[4873]: I0219 09:47:59.310488 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:47:59 crc kubenswrapper[4873]: I0219 09:47:59.609570 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 09:48:00 crc kubenswrapper[4873]: I0219 09:48:00.090590 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:48:00 crc kubenswrapper[4873]: I0219 09:48:00.921345 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:48:00 crc kubenswrapper[4873]: I0219 09:48:00.923150 4873 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:48:00 crc kubenswrapper[4873]: I0219 09:48:00.962727 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.098005 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.268821 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 09:48:01 crc kubenswrapper[4873]: E0219 09:48:01.269144 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee" containerName="pruner" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.269159 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee" containerName="pruner" Feb 19 09:48:01 crc kubenswrapper[4873]: E0219 09:48:01.269177 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74b91a5-c78a-4bd3-92d5-1fd2b7237aca" containerName="pruner" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.269186 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74b91a5-c78a-4bd3-92d5-1fd2b7237aca" containerName="pruner" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.269314 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5411b1f-9d3d-46b7-b1ed-5f8e811f68ee" containerName="pruner" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.269330 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a74b91a5-c78a-4bd3-92d5-1fd2b7237aca" containerName="pruner" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.269765 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.272287 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.272457 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.279867 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.357618 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c21b431f-3ddc-4b17-b162-d39ec7981ec3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c21b431f-3ddc-4b17-b162-d39ec7981ec3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.357654 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c21b431f-3ddc-4b17-b162-d39ec7981ec3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c21b431f-3ddc-4b17-b162-d39ec7981ec3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.426768 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8mch8"] Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.460579 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c21b431f-3ddc-4b17-b162-d39ec7981ec3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c21b431f-3ddc-4b17-b162-d39ec7981ec3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 
09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.460644 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c21b431f-3ddc-4b17-b162-d39ec7981ec3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c21b431f-3ddc-4b17-b162-d39ec7981ec3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.461195 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c21b431f-3ddc-4b17-b162-d39ec7981ec3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c21b431f-3ddc-4b17-b162-d39ec7981ec3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.485891 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c21b431f-3ddc-4b17-b162-d39ec7981ec3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c21b431f-3ddc-4b17-b162-d39ec7981ec3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.637548 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.670995 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.671427 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:48:01 crc kubenswrapper[4873]: I0219 09:48:01.726342 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:48:02 crc kubenswrapper[4873]: I0219 09:48:02.067694 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8mch8" podUID="061e8672-31d8-48ec-87fc-158e44af91e4" containerName="registry-server" containerID="cri-o://2dca850c88c97df8745ed0cd2c09857330c9ddf17d65678c2f357f2f1bc1105f" gracePeriod=2 Feb 19 09:48:02 crc kubenswrapper[4873]: I0219 09:48:02.068210 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 09:48:02 crc kubenswrapper[4873]: W0219 09:48:02.074939 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc21b431f_3ddc_4b17_b162_d39ec7981ec3.slice/crio-3c0e0298ddffb29ceba29d4f153841c4c537720b187ecb029d5a54190b91e7b5 WatchSource:0}: Error finding container 3c0e0298ddffb29ceba29d4f153841c4c537720b187ecb029d5a54190b91e7b5: Status 404 returned error can't find the container with id 3c0e0298ddffb29ceba29d4f153841c4c537720b187ecb029d5a54190b91e7b5 Feb 19 09:48:02 crc kubenswrapper[4873]: I0219 09:48:02.129232 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:48:03 crc kubenswrapper[4873]: I0219 09:48:03.072015 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c21b431f-3ddc-4b17-b162-d39ec7981ec3","Type":"ContainerStarted","Data":"3c0e0298ddffb29ceba29d4f153841c4c537720b187ecb029d5a54190b91e7b5"} Feb 19 09:48:04 crc kubenswrapper[4873]: I0219 09:48:04.080020 4873 generic.go:334] "Generic (PLEG): container finished" podID="061e8672-31d8-48ec-87fc-158e44af91e4" containerID="2dca850c88c97df8745ed0cd2c09857330c9ddf17d65678c2f357f2f1bc1105f" exitCode=0 Feb 19 09:48:04 crc kubenswrapper[4873]: I0219 09:48:04.080151 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mch8" event={"ID":"061e8672-31d8-48ec-87fc-158e44af91e4","Type":"ContainerDied","Data":"2dca850c88c97df8745ed0cd2c09857330c9ddf17d65678c2f357f2f1bc1105f"} Feb 19 09:48:04 crc kubenswrapper[4873]: I0219 09:48:04.081390 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c21b431f-3ddc-4b17-b162-d39ec7981ec3","Type":"ContainerStarted","Data":"2daef70c96b51618fa39301e65d6fec67bde37d7da8c4868ef080055f31a9403"} Feb 19 09:48:04 crc kubenswrapper[4873]: I0219 09:48:04.098096 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=3.098070743 podStartE2EDuration="3.098070743s" podCreationTimestamp="2026-02-19 09:48:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:48:04.092881664 +0000 UTC m=+193.382313302" watchObservedRunningTime="2026-02-19 09:48:04.098070743 +0000 UTC m=+193.387502421" Feb 19 09:48:04 crc kubenswrapper[4873]: I0219 09:48:04.357286 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:48:04 crc kubenswrapper[4873]: I0219 09:48:04.406397 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061e8672-31d8-48ec-87fc-158e44af91e4-utilities\") pod \"061e8672-31d8-48ec-87fc-158e44af91e4\" (UID: \"061e8672-31d8-48ec-87fc-158e44af91e4\") " Feb 19 09:48:04 crc kubenswrapper[4873]: I0219 09:48:04.406456 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq6xh\" (UniqueName: \"kubernetes.io/projected/061e8672-31d8-48ec-87fc-158e44af91e4-kube-api-access-kq6xh\") pod \"061e8672-31d8-48ec-87fc-158e44af91e4\" (UID: \"061e8672-31d8-48ec-87fc-158e44af91e4\") " Feb 19 09:48:04 crc kubenswrapper[4873]: I0219 09:48:04.406559 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061e8672-31d8-48ec-87fc-158e44af91e4-catalog-content\") pod \"061e8672-31d8-48ec-87fc-158e44af91e4\" (UID: \"061e8672-31d8-48ec-87fc-158e44af91e4\") " Feb 19 09:48:04 crc kubenswrapper[4873]: I0219 09:48:04.407283 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/061e8672-31d8-48ec-87fc-158e44af91e4-utilities" (OuterVolumeSpecName: "utilities") pod "061e8672-31d8-48ec-87fc-158e44af91e4" (UID: "061e8672-31d8-48ec-87fc-158e44af91e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:04 crc kubenswrapper[4873]: I0219 09:48:04.411656 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/061e8672-31d8-48ec-87fc-158e44af91e4-kube-api-access-kq6xh" (OuterVolumeSpecName: "kube-api-access-kq6xh") pod "061e8672-31d8-48ec-87fc-158e44af91e4" (UID: "061e8672-31d8-48ec-87fc-158e44af91e4"). InnerVolumeSpecName "kube-api-access-kq6xh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:48:04 crc kubenswrapper[4873]: I0219 09:48:04.466936 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/061e8672-31d8-48ec-87fc-158e44af91e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "061e8672-31d8-48ec-87fc-158e44af91e4" (UID: "061e8672-31d8-48ec-87fc-158e44af91e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:04 crc kubenswrapper[4873]: I0219 09:48:04.507484 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061e8672-31d8-48ec-87fc-158e44af91e4-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:04 crc kubenswrapper[4873]: I0219 09:48:04.507513 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq6xh\" (UniqueName: \"kubernetes.io/projected/061e8672-31d8-48ec-87fc-158e44af91e4-kube-api-access-kq6xh\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:04 crc kubenswrapper[4873]: I0219 09:48:04.507524 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061e8672-31d8-48ec-87fc-158e44af91e4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:05 crc kubenswrapper[4873]: I0219 09:48:05.087770 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mch8" event={"ID":"061e8672-31d8-48ec-87fc-158e44af91e4","Type":"ContainerDied","Data":"4f91cfd7b327a1281f11c3570024804d58c90780fc2519b06ceb03c7886c6273"} Feb 19 09:48:05 crc kubenswrapper[4873]: I0219 09:48:05.087835 4873 scope.go:117] "RemoveContainer" containerID="2dca850c88c97df8745ed0cd2c09857330c9ddf17d65678c2f357f2f1bc1105f" Feb 19 09:48:05 crc kubenswrapper[4873]: I0219 09:48:05.087792 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8mch8" Feb 19 09:48:05 crc kubenswrapper[4873]: I0219 09:48:05.089934 4873 generic.go:334] "Generic (PLEG): container finished" podID="c21b431f-3ddc-4b17-b162-d39ec7981ec3" containerID="2daef70c96b51618fa39301e65d6fec67bde37d7da8c4868ef080055f31a9403" exitCode=0 Feb 19 09:48:05 crc kubenswrapper[4873]: I0219 09:48:05.089968 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c21b431f-3ddc-4b17-b162-d39ec7981ec3","Type":"ContainerDied","Data":"2daef70c96b51618fa39301e65d6fec67bde37d7da8c4868ef080055f31a9403"} Feb 19 09:48:05 crc kubenswrapper[4873]: I0219 09:48:05.109392 4873 scope.go:117] "RemoveContainer" containerID="dde867d0270d4b3c266ef7d78245dcb4ac5bc44881658ebda50b8400fe4b5ef0" Feb 19 09:48:05 crc kubenswrapper[4873]: I0219 09:48:05.135363 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8mch8"] Feb 19 09:48:05 crc kubenswrapper[4873]: I0219 09:48:05.138766 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8mch8"] Feb 19 09:48:05 crc kubenswrapper[4873]: I0219 09:48:05.141476 4873 scope.go:117] "RemoveContainer" containerID="0ca6133aff5d54bab32e361a7911c1c5856dd641828cf1fe1361309a8d03164b" Feb 19 09:48:05 crc kubenswrapper[4873]: I0219 09:48:05.490950 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="061e8672-31d8-48ec-87fc-158e44af91e4" path="/var/lib/kubelet/pods/061e8672-31d8-48ec-87fc-158e44af91e4/volumes" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.400391 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.428500 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c21b431f-3ddc-4b17-b162-d39ec7981ec3-kubelet-dir\") pod \"c21b431f-3ddc-4b17-b162-d39ec7981ec3\" (UID: \"c21b431f-3ddc-4b17-b162-d39ec7981ec3\") " Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.428558 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c21b431f-3ddc-4b17-b162-d39ec7981ec3-kube-api-access\") pod \"c21b431f-3ddc-4b17-b162-d39ec7981ec3\" (UID: \"c21b431f-3ddc-4b17-b162-d39ec7981ec3\") " Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.428652 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c21b431f-3ddc-4b17-b162-d39ec7981ec3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c21b431f-3ddc-4b17-b162-d39ec7981ec3" (UID: "c21b431f-3ddc-4b17-b162-d39ec7981ec3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.434845 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c21b431f-3ddc-4b17-b162-d39ec7981ec3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c21b431f-3ddc-4b17-b162-d39ec7981ec3" (UID: "c21b431f-3ddc-4b17-b162-d39ec7981ec3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.529740 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c21b431f-3ddc-4b17-b162-d39ec7981ec3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.529784 4873 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c21b431f-3ddc-4b17-b162-d39ec7981ec3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.881400 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 09:48:06 crc kubenswrapper[4873]: E0219 09:48:06.881750 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061e8672-31d8-48ec-87fc-158e44af91e4" containerName="extract-content" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.881774 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="061e8672-31d8-48ec-87fc-158e44af91e4" containerName="extract-content" Feb 19 09:48:06 crc kubenswrapper[4873]: E0219 09:48:06.881798 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21b431f-3ddc-4b17-b162-d39ec7981ec3" containerName="pruner" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.881806 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21b431f-3ddc-4b17-b162-d39ec7981ec3" containerName="pruner" Feb 19 09:48:06 crc kubenswrapper[4873]: E0219 09:48:06.881822 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061e8672-31d8-48ec-87fc-158e44af91e4" containerName="extract-utilities" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.881831 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="061e8672-31d8-48ec-87fc-158e44af91e4" containerName="extract-utilities" Feb 19 09:48:06 crc kubenswrapper[4873]: E0219 09:48:06.881843 4873 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061e8672-31d8-48ec-87fc-158e44af91e4" containerName="registry-server" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.881851 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="061e8672-31d8-48ec-87fc-158e44af91e4" containerName="registry-server" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.881976 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="c21b431f-3ddc-4b17-b162-d39ec7981ec3" containerName="pruner" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.881990 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="061e8672-31d8-48ec-87fc-158e44af91e4" containerName="registry-server" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.882513 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.886311 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.933826 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcd45a6e-fa80-4995-bab8-20796784d618-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dcd45a6e-fa80-4995-bab8-20796784d618\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.933887 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dcd45a6e-fa80-4995-bab8-20796784d618-var-lock\") pod \"installer-9-crc\" (UID: \"dcd45a6e-fa80-4995-bab8-20796784d618\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 09:48:06 crc kubenswrapper[4873]: I0219 09:48:06.933905 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd45a6e-fa80-4995-bab8-20796784d618-kube-api-access\") pod \"installer-9-crc\" (UID: \"dcd45a6e-fa80-4995-bab8-20796784d618\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 09:48:07 crc kubenswrapper[4873]: I0219 09:48:07.035309 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dcd45a6e-fa80-4995-bab8-20796784d618-var-lock\") pod \"installer-9-crc\" (UID: \"dcd45a6e-fa80-4995-bab8-20796784d618\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 09:48:07 crc kubenswrapper[4873]: I0219 09:48:07.035388 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd45a6e-fa80-4995-bab8-20796784d618-kube-api-access\") pod \"installer-9-crc\" (UID: \"dcd45a6e-fa80-4995-bab8-20796784d618\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 09:48:07 crc kubenswrapper[4873]: I0219 09:48:07.035449 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dcd45a6e-fa80-4995-bab8-20796784d618-var-lock\") pod \"installer-9-crc\" (UID: \"dcd45a6e-fa80-4995-bab8-20796784d618\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 09:48:07 crc kubenswrapper[4873]: I0219 09:48:07.035486 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcd45a6e-fa80-4995-bab8-20796784d618-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dcd45a6e-fa80-4995-bab8-20796784d618\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 09:48:07 crc kubenswrapper[4873]: I0219 09:48:07.035538 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcd45a6e-fa80-4995-bab8-20796784d618-kubelet-dir\") pod \"installer-9-crc\" 
(UID: \"dcd45a6e-fa80-4995-bab8-20796784d618\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 09:48:07 crc kubenswrapper[4873]: I0219 09:48:07.053793 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd45a6e-fa80-4995-bab8-20796784d618-kube-api-access\") pod \"installer-9-crc\" (UID: \"dcd45a6e-fa80-4995-bab8-20796784d618\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 09:48:07 crc kubenswrapper[4873]: I0219 09:48:07.110769 4873 generic.go:334] "Generic (PLEG): container finished" podID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" containerID="3925b9cd7df38893cf6f1abc778ceaaf22660b5582a02ffd58d2352d46ffbced" exitCode=0 Feb 19 09:48:07 crc kubenswrapper[4873]: I0219 09:48:07.110845 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzcdv" event={"ID":"d152d3c6-e3c6-4255-95b5-eafe02557eb9","Type":"ContainerDied","Data":"3925b9cd7df38893cf6f1abc778ceaaf22660b5582a02ffd58d2352d46ffbced"} Feb 19 09:48:07 crc kubenswrapper[4873]: I0219 09:48:07.114747 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c21b431f-3ddc-4b17-b162-d39ec7981ec3","Type":"ContainerDied","Data":"3c0e0298ddffb29ceba29d4f153841c4c537720b187ecb029d5a54190b91e7b5"} Feb 19 09:48:07 crc kubenswrapper[4873]: I0219 09:48:07.114787 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c0e0298ddffb29ceba29d4f153841c4c537720b187ecb029d5a54190b91e7b5" Feb 19 09:48:07 crc kubenswrapper[4873]: I0219 09:48:07.114823 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 09:48:07 crc kubenswrapper[4873]: I0219 09:48:07.196453 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 09:48:07 crc kubenswrapper[4873]: I0219 09:48:07.622334 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 09:48:07 crc kubenswrapper[4873]: W0219 09:48:07.633010 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddcd45a6e_fa80_4995_bab8_20796784d618.slice/crio-3520c7c7b57ec4d387d5a52af5a8868db2183fc59373efaee14a1f7ca6894456 WatchSource:0}: Error finding container 3520c7c7b57ec4d387d5a52af5a8868db2183fc59373efaee14a1f7ca6894456: Status 404 returned error can't find the container with id 3520c7c7b57ec4d387d5a52af5a8868db2183fc59373efaee14a1f7ca6894456 Feb 19 09:48:08 crc kubenswrapper[4873]: I0219 09:48:08.121505 4873 generic.go:334] "Generic (PLEG): container finished" podID="9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" containerID="1479a6032ed4c9b61367740ab02f5b5f34a581176449601b359feaf96fa47f0a" exitCode=0 Feb 19 09:48:08 crc kubenswrapper[4873]: I0219 09:48:08.121604 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnf24" event={"ID":"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3","Type":"ContainerDied","Data":"1479a6032ed4c9b61367740ab02f5b5f34a581176449601b359feaf96fa47f0a"} Feb 19 09:48:08 crc kubenswrapper[4873]: I0219 09:48:08.126271 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzcdv" event={"ID":"d152d3c6-e3c6-4255-95b5-eafe02557eb9","Type":"ContainerStarted","Data":"2b6ac58d0b390cd493f6fbf85d5fe8746c584e84c39ef88c4e2c4eff614d5cf2"} Feb 19 09:48:08 crc kubenswrapper[4873]: I0219 09:48:08.128851 4873 generic.go:334] "Generic (PLEG): container finished" podID="e52516d8-c410-4dbd-b41f-cbda11425b0e" containerID="502bbe7da078201a1c59434c07360963e8b5efd274f7d66baccf688c7b0233a8" exitCode=0 Feb 19 09:48:08 crc kubenswrapper[4873]: I0219 09:48:08.128909 4873 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-5fj2x" event={"ID":"e52516d8-c410-4dbd-b41f-cbda11425b0e","Type":"ContainerDied","Data":"502bbe7da078201a1c59434c07360963e8b5efd274f7d66baccf688c7b0233a8"} Feb 19 09:48:08 crc kubenswrapper[4873]: I0219 09:48:08.131990 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dcd45a6e-fa80-4995-bab8-20796784d618","Type":"ContainerStarted","Data":"12e07b634f8034e56f9833d14110782d34f2365b31aa7149ce239d933850da51"} Feb 19 09:48:08 crc kubenswrapper[4873]: I0219 09:48:08.132020 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dcd45a6e-fa80-4995-bab8-20796784d618","Type":"ContainerStarted","Data":"3520c7c7b57ec4d387d5a52af5a8868db2183fc59373efaee14a1f7ca6894456"} Feb 19 09:48:08 crc kubenswrapper[4873]: I0219 09:48:08.162163 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.162144824 podStartE2EDuration="2.162144824s" podCreationTimestamp="2026-02-19 09:48:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:48:08.161277452 +0000 UTC m=+197.450709100" watchObservedRunningTime="2026-02-19 09:48:08.162144824 +0000 UTC m=+197.451576462" Feb 19 09:48:08 crc kubenswrapper[4873]: I0219 09:48:08.203715 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dzcdv" podStartSLOduration=2.807075704 podStartE2EDuration="47.2036986s" podCreationTimestamp="2026-02-19 09:47:21 +0000 UTC" firstStartedPulling="2026-02-19 09:47:23.108926718 +0000 UTC m=+152.398358356" lastFinishedPulling="2026-02-19 09:48:07.505549624 +0000 UTC m=+196.794981252" observedRunningTime="2026-02-19 09:48:08.200220362 +0000 UTC m=+197.489652020" 
watchObservedRunningTime="2026-02-19 09:48:08.2036986 +0000 UTC m=+197.493130238"
Feb 19 09:48:09 crc kubenswrapper[4873]: I0219 09:48:09.141810 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fj2x" event={"ID":"e52516d8-c410-4dbd-b41f-cbda11425b0e","Type":"ContainerStarted","Data":"c5a7b8fed896e37da1461b5e8c5138425ba8fee5d6a0a01037ce006e2bb0e0d3"}
Feb 19 09:48:09 crc kubenswrapper[4873]: I0219 09:48:09.144558 4873 generic.go:334] "Generic (PLEG): container finished" podID="e767e90e-5146-4f1e-9f0b-5f5acb185429" containerID="42afdb85925fa3ea042d23acf3f719be18000b496cfd0c75215046ba3925179c" exitCode=0
Feb 19 09:48:09 crc kubenswrapper[4873]: I0219 09:48:09.144639 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jgk6" event={"ID":"e767e90e-5146-4f1e-9f0b-5f5acb185429","Type":"ContainerDied","Data":"42afdb85925fa3ea042d23acf3f719be18000b496cfd0c75215046ba3925179c"}
Feb 19 09:48:09 crc kubenswrapper[4873]: I0219 09:48:09.147140 4873 generic.go:334] "Generic (PLEG): container finished" podID="d5d58373-fe5d-4afe-9da1-256843164ff4" containerID="45818128c5d2d6d5769ffe637ae6ad9e378ad3fe2558a9d3cca64ab9d5a6861c" exitCode=0
Feb 19 09:48:09 crc kubenswrapper[4873]: I0219 09:48:09.147216 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jm66x" event={"ID":"d5d58373-fe5d-4afe-9da1-256843164ff4","Type":"ContainerDied","Data":"45818128c5d2d6d5769ffe637ae6ad9e378ad3fe2558a9d3cca64ab9d5a6861c"}
Feb 19 09:48:09 crc kubenswrapper[4873]: I0219 09:48:09.149371 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnf24" event={"ID":"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3","Type":"ContainerStarted","Data":"65f03008f264ea89ece1ee912ca47401c748ee7054983199b5bd318655ded9bf"}
Feb 19 09:48:09 crc kubenswrapper[4873]: I0219 09:48:09.167320 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5fj2x" podStartSLOduration=3.623359696 podStartE2EDuration="51.167287376s" podCreationTimestamp="2026-02-19 09:47:18 +0000 UTC" firstStartedPulling="2026-02-19 09:47:20.989320456 +0000 UTC m=+150.278752094" lastFinishedPulling="2026-02-19 09:48:08.533248126 +0000 UTC m=+197.822679774" observedRunningTime="2026-02-19 09:48:09.166274791 +0000 UTC m=+198.455706439" watchObservedRunningTime="2026-02-19 09:48:09.167287376 +0000 UTC m=+198.456719034"
Feb 19 09:48:09 crc kubenswrapper[4873]: I0219 09:48:09.232660 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tnf24" podStartSLOduration=3.733666297 podStartE2EDuration="51.232642302s" podCreationTimestamp="2026-02-19 09:47:18 +0000 UTC" firstStartedPulling="2026-02-19 09:47:21.021089039 +0000 UTC m=+150.310520677" lastFinishedPulling="2026-02-19 09:48:08.520065044 +0000 UTC m=+197.809496682" observedRunningTime="2026-02-19 09:48:09.232037477 +0000 UTC m=+198.521469155" watchObservedRunningTime="2026-02-19 09:48:09.232642302 +0000 UTC m=+198.522073940"
Feb 19 09:48:10 crc kubenswrapper[4873]: I0219 09:48:10.159030 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jgk6" event={"ID":"e767e90e-5146-4f1e-9f0b-5f5acb185429","Type":"ContainerStarted","Data":"ed752748d14b1d34c7402f5f46433b310ca9c0b693dca983a096ae322ca1a06a"}
Feb 19 09:48:10 crc kubenswrapper[4873]: I0219 09:48:10.160722 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jm66x" event={"ID":"d5d58373-fe5d-4afe-9da1-256843164ff4","Type":"ContainerStarted","Data":"f9966e12727ab7b65da136d3e9ae3ccc835716701fdc8c61d0ce85583d0d264b"}
Feb 19 09:48:10 crc kubenswrapper[4873]: I0219 09:48:10.184911 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2jgk6" podStartSLOduration=2.644960723 podStartE2EDuration="50.184896554s" podCreationTimestamp="2026-02-19 09:47:20 +0000 UTC" firstStartedPulling="2026-02-19 09:47:22.034201188 +0000 UTC m=+151.323632826" lastFinishedPulling="2026-02-19 09:48:09.574137019 +0000 UTC m=+198.863568657" observedRunningTime="2026-02-19 09:48:10.184880534 +0000 UTC m=+199.474312162" watchObservedRunningTime="2026-02-19 09:48:10.184896554 +0000 UTC m=+199.474328192"
Feb 19 09:48:10 crc kubenswrapper[4873]: I0219 09:48:10.200703 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jm66x" podStartSLOduration=3.626930752 podStartE2EDuration="53.200683752s" podCreationTimestamp="2026-02-19 09:47:17 +0000 UTC" firstStartedPulling="2026-02-19 09:47:19.97421393 +0000 UTC m=+149.263645568" lastFinishedPulling="2026-02-19 09:48:09.54796693 +0000 UTC m=+198.837398568" observedRunningTime="2026-02-19 09:48:10.198503837 +0000 UTC m=+199.487935505" watchObservedRunningTime="2026-02-19 09:48:10.200683752 +0000 UTC m=+199.490115410"
Feb 19 09:48:11 crc kubenswrapper[4873]: I0219 09:48:11.228118 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2jgk6"
Feb 19 09:48:11 crc kubenswrapper[4873]: I0219 09:48:11.228170 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2jgk6"
Feb 19 09:48:11 crc kubenswrapper[4873]: I0219 09:48:11.869725 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dzcdv"
Feb 19 09:48:11 crc kubenswrapper[4873]: I0219 09:48:11.869789 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dzcdv"
Feb 19 09:48:12 crc kubenswrapper[4873]: I0219 09:48:12.277613 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-2jgk6" podUID="e767e90e-5146-4f1e-9f0b-5f5acb185429" containerName="registry-server" probeResult="failure" output=<
Feb 19 09:48:12 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s
Feb 19 09:48:12 crc kubenswrapper[4873]: >
Feb 19 09:48:12 crc kubenswrapper[4873]: I0219 09:48:12.907884 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dzcdv" podUID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" containerName="registry-server" probeResult="failure" output=<
Feb 19 09:48:12 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s
Feb 19 09:48:12 crc kubenswrapper[4873]: >
Feb 19 09:48:16 crc kubenswrapper[4873]: I0219 09:48:16.616539 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-4g545" podUID="9324aa8b-fbce-42bb-b339-0aa2e382efd4" containerName="oauth-openshift" containerID="cri-o://97275dc87160b7aaab21e362565d26836620eb22c875e626afdba41f45da3757" gracePeriod=15
Feb 19 09:48:16 crc kubenswrapper[4873]: I0219 09:48:16.995995 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4g545"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.029689 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-9565f95f5-7q58w"]
Feb 19 09:48:17 crc kubenswrapper[4873]: E0219 09:48:17.029923 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9324aa8b-fbce-42bb-b339-0aa2e382efd4" containerName="oauth-openshift"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.029938 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9324aa8b-fbce-42bb-b339-0aa2e382efd4" containerName="oauth-openshift"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.030058 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="9324aa8b-fbce-42bb-b339-0aa2e382efd4" containerName="oauth-openshift"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.030509 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.046191 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9565f95f5-7q58w"]
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.163177 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-service-ca\") pod \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") "
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.163219 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpxsx\" (UniqueName: \"kubernetes.io/projected/9324aa8b-fbce-42bb-b339-0aa2e382efd4-kube-api-access-fpxsx\") pod \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") "
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.163250 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-login\") pod \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") "
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.163282 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-provider-selection\") pod \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") "
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.163332 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-cliconfig\") pod \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") "
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.163358 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-serving-cert\") pod \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") "
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.163379 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-ocp-branding-template\") pod \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") "
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.163398 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9324aa8b-fbce-42bb-b339-0aa2e382efd4-audit-dir\") pod \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") "
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.163553 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9324aa8b-fbce-42bb-b339-0aa2e382efd4-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9324aa8b-fbce-42bb-b339-0aa2e382efd4" (UID: "9324aa8b-fbce-42bb-b339-0aa2e382efd4"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164124 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "9324aa8b-fbce-42bb-b339-0aa2e382efd4" (UID: "9324aa8b-fbce-42bb-b339-0aa2e382efd4"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164249 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "9324aa8b-fbce-42bb-b339-0aa2e382efd4" (UID: "9324aa8b-fbce-42bb-b339-0aa2e382efd4"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164288 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-idp-0-file-data\") pod \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") "
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164344 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-trusted-ca-bundle\") pod \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") "
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164372 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-audit-policies\") pod \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") "
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164398 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-session\") pod \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") "
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164439 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-router-certs\") pod \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") "
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164468 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-error\") pod \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\" (UID: \"9324aa8b-fbce-42bb-b339-0aa2e382efd4\") "
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164631 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164663 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-router-certs\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164672 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "9324aa8b-fbce-42bb-b339-0aa2e382efd4" (UID: "9324aa8b-fbce-42bb-b339-0aa2e382efd4"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164684 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82e487b8-1a99-4f7a-902a-049dcbaa2715-audit-dir\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164720 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164747 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164782 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82e487b8-1a99-4f7a-902a-049dcbaa2715-audit-policies\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164804 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164880 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164904 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-session\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164949 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-user-template-error\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164969 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-service-ca\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.164991 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-user-template-login\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.165037 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.165072 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pz26\" (UniqueName: \"kubernetes.io/projected/82e487b8-1a99-4f7a-902a-049dcbaa2715-kube-api-access-2pz26\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.165159 4873 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9324aa8b-fbce-42bb-b339-0aa2e382efd4-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.165174 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.165188 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.165200 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.166754 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "9324aa8b-fbce-42bb-b339-0aa2e382efd4" (UID: "9324aa8b-fbce-42bb-b339-0aa2e382efd4"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.174467 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "9324aa8b-fbce-42bb-b339-0aa2e382efd4" (UID: "9324aa8b-fbce-42bb-b339-0aa2e382efd4"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.174851 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "9324aa8b-fbce-42bb-b339-0aa2e382efd4" (UID: "9324aa8b-fbce-42bb-b339-0aa2e382efd4"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.175705 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9324aa8b-fbce-42bb-b339-0aa2e382efd4-kube-api-access-fpxsx" (OuterVolumeSpecName: "kube-api-access-fpxsx") pod "9324aa8b-fbce-42bb-b339-0aa2e382efd4" (UID: "9324aa8b-fbce-42bb-b339-0aa2e382efd4"). InnerVolumeSpecName "kube-api-access-fpxsx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.176521 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "9324aa8b-fbce-42bb-b339-0aa2e382efd4" (UID: "9324aa8b-fbce-42bb-b339-0aa2e382efd4"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.179998 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "9324aa8b-fbce-42bb-b339-0aa2e382efd4" (UID: "9324aa8b-fbce-42bb-b339-0aa2e382efd4"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.180211 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "9324aa8b-fbce-42bb-b339-0aa2e382efd4" (UID: "9324aa8b-fbce-42bb-b339-0aa2e382efd4"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.182324 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "9324aa8b-fbce-42bb-b339-0aa2e382efd4" (UID: "9324aa8b-fbce-42bb-b339-0aa2e382efd4"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.182548 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "9324aa8b-fbce-42bb-b339-0aa2e382efd4" (UID: "9324aa8b-fbce-42bb-b339-0aa2e382efd4"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.183019 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "9324aa8b-fbce-42bb-b339-0aa2e382efd4" (UID: "9324aa8b-fbce-42bb-b339-0aa2e382efd4"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.197948 4873 generic.go:334] "Generic (PLEG): container finished" podID="9324aa8b-fbce-42bb-b339-0aa2e382efd4" containerID="97275dc87160b7aaab21e362565d26836620eb22c875e626afdba41f45da3757" exitCode=0
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.197994 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4g545" event={"ID":"9324aa8b-fbce-42bb-b339-0aa2e382efd4","Type":"ContainerDied","Data":"97275dc87160b7aaab21e362565d26836620eb22c875e626afdba41f45da3757"}
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.198015 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4g545"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.198032 4873 scope.go:117] "RemoveContainer" containerID="97275dc87160b7aaab21e362565d26836620eb22c875e626afdba41f45da3757"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.198021 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4g545" event={"ID":"9324aa8b-fbce-42bb-b339-0aa2e382efd4","Type":"ContainerDied","Data":"b4cafb3addf61abe3b1441fa50a8321f11c79cf993ea43c1a09c9c8ca90fbdfc"}
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.229903 4873 scope.go:117] "RemoveContainer" containerID="97275dc87160b7aaab21e362565d26836620eb22c875e626afdba41f45da3757"
Feb 19 09:48:17 crc kubenswrapper[4873]: E0219 09:48:17.230529 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97275dc87160b7aaab21e362565d26836620eb22c875e626afdba41f45da3757\": container with ID starting with 97275dc87160b7aaab21e362565d26836620eb22c875e626afdba41f45da3757 not found: ID does not exist" containerID="97275dc87160b7aaab21e362565d26836620eb22c875e626afdba41f45da3757"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.230579 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97275dc87160b7aaab21e362565d26836620eb22c875e626afdba41f45da3757"} err="failed to get container status \"97275dc87160b7aaab21e362565d26836620eb22c875e626afdba41f45da3757\": rpc error: code = NotFound desc = could not find container \"97275dc87160b7aaab21e362565d26836620eb22c875e626afdba41f45da3757\": container with ID starting with 97275dc87160b7aaab21e362565d26836620eb22c875e626afdba41f45da3757 not found: ID does not exist"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.234036 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4g545"]
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.238540 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4g545"]
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266081 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266158 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266184 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82e487b8-1a99-4f7a-902a-049dcbaa2715-audit-policies\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266207 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266242 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266262 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-session\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266280 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-service-ca\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266296 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-user-template-error\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266311 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-user-template-login\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266341 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266362 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pz26\" (UniqueName: \"kubernetes.io/projected/82e487b8-1a99-4f7a-902a-049dcbaa2715-kube-api-access-2pz26\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266395 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266410 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-router-certs\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266426 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82e487b8-1a99-4f7a-902a-049dcbaa2715-audit-dir\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266464 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266475 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266485 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266496 4873 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9324aa8b-fbce-42bb-b339-0aa2e382efd4-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266504 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266517 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266528 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266539 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpxsx\" (UniqueName: \"kubernetes.io/projected/9324aa8b-fbce-42bb-b339-0aa2e382efd4-kube-api-access-fpxsx\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266550 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266560 4873 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9324aa8b-fbce-42bb-b339-0aa2e382efd4-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.266601 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82e487b8-1a99-4f7a-902a-049dcbaa2715-audit-dir\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.267257 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-service-ca\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.267632 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82e487b8-1a99-4f7a-902a-049dcbaa2715-audit-policies\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.267938 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w"
Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.268578 4873 operation_generator.go:637] "MountVolume.SetUp succeeded
for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.271599 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-user-template-error\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.271628 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.271638 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.272051 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: 
\"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.272184 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.272230 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-router-certs\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.272388 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-user-template-login\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.274268 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/82e487b8-1a99-4f7a-902a-049dcbaa2715-v4-0-config-system-session\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.285568 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pz26\" 
(UniqueName: \"kubernetes.io/projected/82e487b8-1a99-4f7a-902a-049dcbaa2715-kube-api-access-2pz26\") pod \"oauth-openshift-9565f95f5-7q58w\" (UID: \"82e487b8-1a99-4f7a-902a-049dcbaa2715\") " pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.345348 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.493681 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9324aa8b-fbce-42bb-b339-0aa2e382efd4" path="/var/lib/kubelet/pods/9324aa8b-fbce-42bb-b339-0aa2e382efd4/volumes" Feb 19 09:48:17 crc kubenswrapper[4873]: I0219 09:48:17.761503 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9565f95f5-7q58w"] Feb 19 09:48:17 crc kubenswrapper[4873]: W0219 09:48:17.782542 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82e487b8_1a99_4f7a_902a_049dcbaa2715.slice/crio-7c15b3027e2e58501d78526d48ad77fd4f3d49eb9619d286717e8cb4bb237de7 WatchSource:0}: Error finding container 7c15b3027e2e58501d78526d48ad77fd4f3d49eb9619d286717e8cb4bb237de7: Status 404 returned error can't find the container with id 7c15b3027e2e58501d78526d48ad77fd4f3d49eb9619d286717e8cb4bb237de7 Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.214322 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" event={"ID":"82e487b8-1a99-4f7a-902a-049dcbaa2715","Type":"ContainerStarted","Data":"7c15b3027e2e58501d78526d48ad77fd4f3d49eb9619d286717e8cb4bb237de7"} Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.241424 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.241502 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.241594 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.242462 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9afd159bda3d2cbd676930ddf4df8cf39b5da5575d7c4d647ae91446f1b76837"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.242565 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://9afd159bda3d2cbd676930ddf4df8cf39b5da5575d7c4d647ae91446f1b76837" gracePeriod=600 Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.437396 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.437455 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.516211 4873 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.518934 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.527825 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.575090 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.931753 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.932165 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:48:18 crc kubenswrapper[4873]: I0219 09:48:18.991822 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:48:19 crc kubenswrapper[4873]: I0219 09:48:19.221870 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="9afd159bda3d2cbd676930ddf4df8cf39b5da5575d7c4d647ae91446f1b76837" exitCode=0 Feb 19 09:48:19 crc kubenswrapper[4873]: I0219 09:48:19.221958 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"9afd159bda3d2cbd676930ddf4df8cf39b5da5575d7c4d647ae91446f1b76837"} Feb 19 09:48:19 crc kubenswrapper[4873]: I0219 09:48:19.222016 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"5ecc9e74f65542c5ba1361ec123b0a6a0ddd50ca3d18c190393ca23d1531b88a"} Feb 19 09:48:19 crc kubenswrapper[4873]: I0219 09:48:19.223463 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" event={"ID":"82e487b8-1a99-4f7a-902a-049dcbaa2715","Type":"ContainerStarted","Data":"8241b3ce1c6b5b4f0d12db7396350203e280fb8c3078015365e0b62e853f2b61"} Feb 19 09:48:19 crc kubenswrapper[4873]: I0219 09:48:19.265697 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" podStartSLOduration=28.265673935 podStartE2EDuration="28.265673935s" podCreationTimestamp="2026-02-19 09:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:48:19.263341116 +0000 UTC m=+208.552772774" watchObservedRunningTime="2026-02-19 09:48:19.265673935 +0000 UTC m=+208.555105593" Feb 19 09:48:19 crc kubenswrapper[4873]: I0219 09:48:19.271909 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:48:19 crc kubenswrapper[4873]: I0219 09:48:19.273132 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:48:19 crc kubenswrapper[4873]: I0219 09:48:19.278232 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:48:20 crc kubenswrapper[4873]: I0219 09:48:20.229829 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" Feb 19 09:48:20 crc kubenswrapper[4873]: I0219 09:48:20.236055 4873 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-9565f95f5-7q58w" Feb 19 09:48:21 crc kubenswrapper[4873]: I0219 09:48:21.231823 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5fj2x"] Feb 19 09:48:21 crc kubenswrapper[4873]: I0219 09:48:21.299786 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:48:21 crc kubenswrapper[4873]: I0219 09:48:21.357116 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:48:21 crc kubenswrapper[4873]: I0219 09:48:21.921235 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:48:21 crc kubenswrapper[4873]: I0219 09:48:21.972332 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:48:22 crc kubenswrapper[4873]: I0219 09:48:22.244259 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5fj2x" podUID="e52516d8-c410-4dbd-b41f-cbda11425b0e" containerName="registry-server" containerID="cri-o://c5a7b8fed896e37da1461b5e8c5138425ba8fee5d6a0a01037ce006e2bb0e0d3" gracePeriod=2 Feb 19 09:48:22 crc kubenswrapper[4873]: I0219 09:48:22.708052 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:48:22 crc kubenswrapper[4873]: I0219 09:48:22.874607 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e52516d8-c410-4dbd-b41f-cbda11425b0e-utilities\") pod \"e52516d8-c410-4dbd-b41f-cbda11425b0e\" (UID: \"e52516d8-c410-4dbd-b41f-cbda11425b0e\") " Feb 19 09:48:22 crc kubenswrapper[4873]: I0219 09:48:22.874645 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc4qh\" (UniqueName: \"kubernetes.io/projected/e52516d8-c410-4dbd-b41f-cbda11425b0e-kube-api-access-hc4qh\") pod \"e52516d8-c410-4dbd-b41f-cbda11425b0e\" (UID: \"e52516d8-c410-4dbd-b41f-cbda11425b0e\") " Feb 19 09:48:22 crc kubenswrapper[4873]: I0219 09:48:22.874675 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e52516d8-c410-4dbd-b41f-cbda11425b0e-catalog-content\") pod \"e52516d8-c410-4dbd-b41f-cbda11425b0e\" (UID: \"e52516d8-c410-4dbd-b41f-cbda11425b0e\") " Feb 19 09:48:22 crc kubenswrapper[4873]: I0219 09:48:22.875492 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52516d8-c410-4dbd-b41f-cbda11425b0e-utilities" (OuterVolumeSpecName: "utilities") pod "e52516d8-c410-4dbd-b41f-cbda11425b0e" (UID: "e52516d8-c410-4dbd-b41f-cbda11425b0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:22 crc kubenswrapper[4873]: I0219 09:48:22.879680 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e52516d8-c410-4dbd-b41f-cbda11425b0e-kube-api-access-hc4qh" (OuterVolumeSpecName: "kube-api-access-hc4qh") pod "e52516d8-c410-4dbd-b41f-cbda11425b0e" (UID: "e52516d8-c410-4dbd-b41f-cbda11425b0e"). InnerVolumeSpecName "kube-api-access-hc4qh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:48:22 crc kubenswrapper[4873]: I0219 09:48:22.942678 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e52516d8-c410-4dbd-b41f-cbda11425b0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e52516d8-c410-4dbd-b41f-cbda11425b0e" (UID: "e52516d8-c410-4dbd-b41f-cbda11425b0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:22 crc kubenswrapper[4873]: I0219 09:48:22.977174 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e52516d8-c410-4dbd-b41f-cbda11425b0e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:22 crc kubenswrapper[4873]: I0219 09:48:22.977234 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc4qh\" (UniqueName: \"kubernetes.io/projected/e52516d8-c410-4dbd-b41f-cbda11425b0e-kube-api-access-hc4qh\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:22 crc kubenswrapper[4873]: I0219 09:48:22.977256 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e52516d8-c410-4dbd-b41f-cbda11425b0e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.251318 4873 generic.go:334] "Generic (PLEG): container finished" podID="e52516d8-c410-4dbd-b41f-cbda11425b0e" containerID="c5a7b8fed896e37da1461b5e8c5138425ba8fee5d6a0a01037ce006e2bb0e0d3" exitCode=0 Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.251370 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5fj2x" Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.251391 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fj2x" event={"ID":"e52516d8-c410-4dbd-b41f-cbda11425b0e","Type":"ContainerDied","Data":"c5a7b8fed896e37da1461b5e8c5138425ba8fee5d6a0a01037ce006e2bb0e0d3"} Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.251911 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fj2x" event={"ID":"e52516d8-c410-4dbd-b41f-cbda11425b0e","Type":"ContainerDied","Data":"3d229a1d7483ee232f5190406e28ea1aa38e3259959252fbb620deb657e8a447"} Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.251957 4873 scope.go:117] "RemoveContainer" containerID="c5a7b8fed896e37da1461b5e8c5138425ba8fee5d6a0a01037ce006e2bb0e0d3" Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.281023 4873 scope.go:117] "RemoveContainer" containerID="502bbe7da078201a1c59434c07360963e8b5efd274f7d66baccf688c7b0233a8" Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.282738 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5fj2x"] Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.288336 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5fj2x"] Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.319808 4873 scope.go:117] "RemoveContainer" containerID="359abec79bd5f24ab125135475868fe55a53d89f41fe8e7574eec6dce9ec9eda" Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.346049 4873 scope.go:117] "RemoveContainer" containerID="c5a7b8fed896e37da1461b5e8c5138425ba8fee5d6a0a01037ce006e2bb0e0d3" Feb 19 09:48:23 crc kubenswrapper[4873]: E0219 09:48:23.346529 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c5a7b8fed896e37da1461b5e8c5138425ba8fee5d6a0a01037ce006e2bb0e0d3\": container with ID starting with c5a7b8fed896e37da1461b5e8c5138425ba8fee5d6a0a01037ce006e2bb0e0d3 not found: ID does not exist" containerID="c5a7b8fed896e37da1461b5e8c5138425ba8fee5d6a0a01037ce006e2bb0e0d3" Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.346567 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5a7b8fed896e37da1461b5e8c5138425ba8fee5d6a0a01037ce006e2bb0e0d3"} err="failed to get container status \"c5a7b8fed896e37da1461b5e8c5138425ba8fee5d6a0a01037ce006e2bb0e0d3\": rpc error: code = NotFound desc = could not find container \"c5a7b8fed896e37da1461b5e8c5138425ba8fee5d6a0a01037ce006e2bb0e0d3\": container with ID starting with c5a7b8fed896e37da1461b5e8c5138425ba8fee5d6a0a01037ce006e2bb0e0d3 not found: ID does not exist" Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.346592 4873 scope.go:117] "RemoveContainer" containerID="502bbe7da078201a1c59434c07360963e8b5efd274f7d66baccf688c7b0233a8" Feb 19 09:48:23 crc kubenswrapper[4873]: E0219 09:48:23.347002 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"502bbe7da078201a1c59434c07360963e8b5efd274f7d66baccf688c7b0233a8\": container with ID starting with 502bbe7da078201a1c59434c07360963e8b5efd274f7d66baccf688c7b0233a8 not found: ID does not exist" containerID="502bbe7da078201a1c59434c07360963e8b5efd274f7d66baccf688c7b0233a8" Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.347029 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"502bbe7da078201a1c59434c07360963e8b5efd274f7d66baccf688c7b0233a8"} err="failed to get container status \"502bbe7da078201a1c59434c07360963e8b5efd274f7d66baccf688c7b0233a8\": rpc error: code = NotFound desc = could not find container \"502bbe7da078201a1c59434c07360963e8b5efd274f7d66baccf688c7b0233a8\": container with ID 
starting with 502bbe7da078201a1c59434c07360963e8b5efd274f7d66baccf688c7b0233a8 not found: ID does not exist" Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.347046 4873 scope.go:117] "RemoveContainer" containerID="359abec79bd5f24ab125135475868fe55a53d89f41fe8e7574eec6dce9ec9eda" Feb 19 09:48:23 crc kubenswrapper[4873]: E0219 09:48:23.349548 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"359abec79bd5f24ab125135475868fe55a53d89f41fe8e7574eec6dce9ec9eda\": container with ID starting with 359abec79bd5f24ab125135475868fe55a53d89f41fe8e7574eec6dce9ec9eda not found: ID does not exist" containerID="359abec79bd5f24ab125135475868fe55a53d89f41fe8e7574eec6dce9ec9eda" Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.349591 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"359abec79bd5f24ab125135475868fe55a53d89f41fe8e7574eec6dce9ec9eda"} err="failed to get container status \"359abec79bd5f24ab125135475868fe55a53d89f41fe8e7574eec6dce9ec9eda\": rpc error: code = NotFound desc = could not find container \"359abec79bd5f24ab125135475868fe55a53d89f41fe8e7574eec6dce9ec9eda\": container with ID starting with 359abec79bd5f24ab125135475868fe55a53d89f41fe8e7574eec6dce9ec9eda not found: ID does not exist" Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.489832 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e52516d8-c410-4dbd-b41f-cbda11425b0e" path="/var/lib/kubelet/pods/e52516d8-c410-4dbd-b41f-cbda11425b0e/volumes" Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.632466 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jgk6"] Feb 19 09:48:23 crc kubenswrapper[4873]: I0219 09:48:23.632797 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2jgk6" 
podUID="e767e90e-5146-4f1e-9f0b-5f5acb185429" containerName="registry-server" containerID="cri-o://ed752748d14b1d34c7402f5f46433b310ca9c0b693dca983a096ae322ca1a06a" gracePeriod=2 Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.071335 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.191174 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm9f6\" (UniqueName: \"kubernetes.io/projected/e767e90e-5146-4f1e-9f0b-5f5acb185429-kube-api-access-vm9f6\") pod \"e767e90e-5146-4f1e-9f0b-5f5acb185429\" (UID: \"e767e90e-5146-4f1e-9f0b-5f5acb185429\") " Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.191302 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e767e90e-5146-4f1e-9f0b-5f5acb185429-utilities\") pod \"e767e90e-5146-4f1e-9f0b-5f5acb185429\" (UID: \"e767e90e-5146-4f1e-9f0b-5f5acb185429\") " Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.191349 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e767e90e-5146-4f1e-9f0b-5f5acb185429-catalog-content\") pod \"e767e90e-5146-4f1e-9f0b-5f5acb185429\" (UID: \"e767e90e-5146-4f1e-9f0b-5f5acb185429\") " Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.192432 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e767e90e-5146-4f1e-9f0b-5f5acb185429-utilities" (OuterVolumeSpecName: "utilities") pod "e767e90e-5146-4f1e-9f0b-5f5acb185429" (UID: "e767e90e-5146-4f1e-9f0b-5f5acb185429"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.196308 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e767e90e-5146-4f1e-9f0b-5f5acb185429-kube-api-access-vm9f6" (OuterVolumeSpecName: "kube-api-access-vm9f6") pod "e767e90e-5146-4f1e-9f0b-5f5acb185429" (UID: "e767e90e-5146-4f1e-9f0b-5f5acb185429"). InnerVolumeSpecName "kube-api-access-vm9f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.220948 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e767e90e-5146-4f1e-9f0b-5f5acb185429-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e767e90e-5146-4f1e-9f0b-5f5acb185429" (UID: "e767e90e-5146-4f1e-9f0b-5f5acb185429"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.260187 4873 generic.go:334] "Generic (PLEG): container finished" podID="e767e90e-5146-4f1e-9f0b-5f5acb185429" containerID="ed752748d14b1d34c7402f5f46433b310ca9c0b693dca983a096ae322ca1a06a" exitCode=0 Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.260273 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jgk6" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.260232 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jgk6" event={"ID":"e767e90e-5146-4f1e-9f0b-5f5acb185429","Type":"ContainerDied","Data":"ed752748d14b1d34c7402f5f46433b310ca9c0b693dca983a096ae322ca1a06a"} Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.260324 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jgk6" event={"ID":"e767e90e-5146-4f1e-9f0b-5f5acb185429","Type":"ContainerDied","Data":"2314d422015de89852637bb24195891fe7f3f4631802a8cc5426a4f84f3df229"} Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.260346 4873 scope.go:117] "RemoveContainer" containerID="ed752748d14b1d34c7402f5f46433b310ca9c0b693dca983a096ae322ca1a06a" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.281498 4873 scope.go:117] "RemoveContainer" containerID="42afdb85925fa3ea042d23acf3f719be18000b496cfd0c75215046ba3925179c" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.292931 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm9f6\" (UniqueName: \"kubernetes.io/projected/e767e90e-5146-4f1e-9f0b-5f5acb185429-kube-api-access-vm9f6\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.292966 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e767e90e-5146-4f1e-9f0b-5f5acb185429-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.292981 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e767e90e-5146-4f1e-9f0b-5f5acb185429-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.306642 4873 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-2jgk6"] Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.309544 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jgk6"] Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.310887 4873 scope.go:117] "RemoveContainer" containerID="3d19858345e60805d3d74034c65385dcd27dffb04874a42aed209e62e4d2c8a6" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.332418 4873 scope.go:117] "RemoveContainer" containerID="ed752748d14b1d34c7402f5f46433b310ca9c0b693dca983a096ae322ca1a06a" Feb 19 09:48:24 crc kubenswrapper[4873]: E0219 09:48:24.332913 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed752748d14b1d34c7402f5f46433b310ca9c0b693dca983a096ae322ca1a06a\": container with ID starting with ed752748d14b1d34c7402f5f46433b310ca9c0b693dca983a096ae322ca1a06a not found: ID does not exist" containerID="ed752748d14b1d34c7402f5f46433b310ca9c0b693dca983a096ae322ca1a06a" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.332939 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed752748d14b1d34c7402f5f46433b310ca9c0b693dca983a096ae322ca1a06a"} err="failed to get container status \"ed752748d14b1d34c7402f5f46433b310ca9c0b693dca983a096ae322ca1a06a\": rpc error: code = NotFound desc = could not find container \"ed752748d14b1d34c7402f5f46433b310ca9c0b693dca983a096ae322ca1a06a\": container with ID starting with ed752748d14b1d34c7402f5f46433b310ca9c0b693dca983a096ae322ca1a06a not found: ID does not exist" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.332958 4873 scope.go:117] "RemoveContainer" containerID="42afdb85925fa3ea042d23acf3f719be18000b496cfd0c75215046ba3925179c" Feb 19 09:48:24 crc kubenswrapper[4873]: E0219 09:48:24.333242 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"42afdb85925fa3ea042d23acf3f719be18000b496cfd0c75215046ba3925179c\": container with ID starting with 42afdb85925fa3ea042d23acf3f719be18000b496cfd0c75215046ba3925179c not found: ID does not exist" containerID="42afdb85925fa3ea042d23acf3f719be18000b496cfd0c75215046ba3925179c" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.333395 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42afdb85925fa3ea042d23acf3f719be18000b496cfd0c75215046ba3925179c"} err="failed to get container status \"42afdb85925fa3ea042d23acf3f719be18000b496cfd0c75215046ba3925179c\": rpc error: code = NotFound desc = could not find container \"42afdb85925fa3ea042d23acf3f719be18000b496cfd0c75215046ba3925179c\": container with ID starting with 42afdb85925fa3ea042d23acf3f719be18000b496cfd0c75215046ba3925179c not found: ID does not exist" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.333481 4873 scope.go:117] "RemoveContainer" containerID="3d19858345e60805d3d74034c65385dcd27dffb04874a42aed209e62e4d2c8a6" Feb 19 09:48:24 crc kubenswrapper[4873]: E0219 09:48:24.333876 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d19858345e60805d3d74034c65385dcd27dffb04874a42aed209e62e4d2c8a6\": container with ID starting with 3d19858345e60805d3d74034c65385dcd27dffb04874a42aed209e62e4d2c8a6 not found: ID does not exist" containerID="3d19858345e60805d3d74034c65385dcd27dffb04874a42aed209e62e4d2c8a6" Feb 19 09:48:24 crc kubenswrapper[4873]: I0219 09:48:24.333926 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d19858345e60805d3d74034c65385dcd27dffb04874a42aed209e62e4d2c8a6"} err="failed to get container status \"3d19858345e60805d3d74034c65385dcd27dffb04874a42aed209e62e4d2c8a6\": rpc error: code = NotFound desc = could not find container 
\"3d19858345e60805d3d74034c65385dcd27dffb04874a42aed209e62e4d2c8a6\": container with ID starting with 3d19858345e60805d3d74034c65385dcd27dffb04874a42aed209e62e4d2c8a6 not found: ID does not exist" Feb 19 09:48:25 crc kubenswrapper[4873]: I0219 09:48:25.498682 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e767e90e-5146-4f1e-9f0b-5f5acb185429" path="/var/lib/kubelet/pods/e767e90e-5146-4f1e-9f0b-5f5acb185429/volumes" Feb 19 09:48:26 crc kubenswrapper[4873]: I0219 09:48:26.027051 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzcdv"] Feb 19 09:48:26 crc kubenswrapper[4873]: I0219 09:48:26.027348 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dzcdv" podUID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" containerName="registry-server" containerID="cri-o://2b6ac58d0b390cd493f6fbf85d5fe8746c584e84c39ef88c4e2c4eff614d5cf2" gracePeriod=2 Feb 19 09:48:26 crc kubenswrapper[4873]: I0219 09:48:26.278449 4873 generic.go:334] "Generic (PLEG): container finished" podID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" containerID="2b6ac58d0b390cd493f6fbf85d5fe8746c584e84c39ef88c4e2c4eff614d5cf2" exitCode=0 Feb 19 09:48:26 crc kubenswrapper[4873]: I0219 09:48:26.278505 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzcdv" event={"ID":"d152d3c6-e3c6-4255-95b5-eafe02557eb9","Type":"ContainerDied","Data":"2b6ac58d0b390cd493f6fbf85d5fe8746c584e84c39ef88c4e2c4eff614d5cf2"} Feb 19 09:48:26 crc kubenswrapper[4873]: I0219 09:48:26.504677 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:48:26 crc kubenswrapper[4873]: I0219 09:48:26.620673 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d152d3c6-e3c6-4255-95b5-eafe02557eb9-catalog-content\") pod \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\" (UID: \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\") " Feb 19 09:48:26 crc kubenswrapper[4873]: I0219 09:48:26.620811 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zms24\" (UniqueName: \"kubernetes.io/projected/d152d3c6-e3c6-4255-95b5-eafe02557eb9-kube-api-access-zms24\") pod \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\" (UID: \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\") " Feb 19 09:48:26 crc kubenswrapper[4873]: I0219 09:48:26.620892 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d152d3c6-e3c6-4255-95b5-eafe02557eb9-utilities\") pod \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\" (UID: \"d152d3c6-e3c6-4255-95b5-eafe02557eb9\") " Feb 19 09:48:26 crc kubenswrapper[4873]: I0219 09:48:26.622298 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d152d3c6-e3c6-4255-95b5-eafe02557eb9-utilities" (OuterVolumeSpecName: "utilities") pod "d152d3c6-e3c6-4255-95b5-eafe02557eb9" (UID: "d152d3c6-e3c6-4255-95b5-eafe02557eb9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:26 crc kubenswrapper[4873]: I0219 09:48:26.629639 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d152d3c6-e3c6-4255-95b5-eafe02557eb9-kube-api-access-zms24" (OuterVolumeSpecName: "kube-api-access-zms24") pod "d152d3c6-e3c6-4255-95b5-eafe02557eb9" (UID: "d152d3c6-e3c6-4255-95b5-eafe02557eb9"). InnerVolumeSpecName "kube-api-access-zms24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:48:26 crc kubenswrapper[4873]: I0219 09:48:26.722957 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zms24\" (UniqueName: \"kubernetes.io/projected/d152d3c6-e3c6-4255-95b5-eafe02557eb9-kube-api-access-zms24\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:26 crc kubenswrapper[4873]: I0219 09:48:26.723011 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d152d3c6-e3c6-4255-95b5-eafe02557eb9-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:26 crc kubenswrapper[4873]: I0219 09:48:26.773085 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d152d3c6-e3c6-4255-95b5-eafe02557eb9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d152d3c6-e3c6-4255-95b5-eafe02557eb9" (UID: "d152d3c6-e3c6-4255-95b5-eafe02557eb9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:26 crc kubenswrapper[4873]: I0219 09:48:26.824556 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d152d3c6-e3c6-4255-95b5-eafe02557eb9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:27 crc kubenswrapper[4873]: I0219 09:48:27.286325 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzcdv" event={"ID":"d152d3c6-e3c6-4255-95b5-eafe02557eb9","Type":"ContainerDied","Data":"81340de24ca383dbb41a0340acf197019d868e5563832a3950dc50b33c15f087"} Feb 19 09:48:27 crc kubenswrapper[4873]: I0219 09:48:27.286395 4873 scope.go:117] "RemoveContainer" containerID="2b6ac58d0b390cd493f6fbf85d5fe8746c584e84c39ef88c4e2c4eff614d5cf2" Feb 19 09:48:27 crc kubenswrapper[4873]: I0219 09:48:27.286427 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dzcdv" Feb 19 09:48:27 crc kubenswrapper[4873]: I0219 09:48:27.312244 4873 scope.go:117] "RemoveContainer" containerID="3925b9cd7df38893cf6f1abc778ceaaf22660b5582a02ffd58d2352d46ffbced" Feb 19 09:48:27 crc kubenswrapper[4873]: I0219 09:48:27.331092 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzcdv"] Feb 19 09:48:27 crc kubenswrapper[4873]: I0219 09:48:27.334454 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dzcdv"] Feb 19 09:48:27 crc kubenswrapper[4873]: I0219 09:48:27.348582 4873 scope.go:117] "RemoveContainer" containerID="75a546ab60f91886bf73906724d9833647cf46b858664cd39c852a73088064e8" Feb 19 09:48:27 crc kubenswrapper[4873]: I0219 09:48:27.493416 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" path="/var/lib/kubelet/pods/d152d3c6-e3c6-4255-95b5-eafe02557eb9/volumes" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.413858 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tnf24"] Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.414705 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tnf24" podUID="9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" containerName="registry-server" containerID="cri-o://65f03008f264ea89ece1ee912ca47401c748ee7054983199b5bd318655ded9bf" gracePeriod=30 Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.440323 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jm66x"] Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.440675 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jm66x" podUID="d5d58373-fe5d-4afe-9da1-256843164ff4" 
containerName="registry-server" containerID="cri-o://f9966e12727ab7b65da136d3e9ae3ccc835716701fdc8c61d0ce85583d0d264b" gracePeriod=30 Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.449732 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-86hhq"] Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.450075 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" podUID="f905b5ea-71df-4b1c-997c-d68766bcfcfe" containerName="marketplace-operator" containerID="cri-o://1c029703d8f2912597cb5a128fcbab53d9cfdb22e857dd41250cc1badc58b026" gracePeriod=30 Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.454055 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hv2j6"] Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.454363 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hv2j6" podUID="0954690a-09f0-4b1b-be57-db87e9304488" containerName="registry-server" containerID="cri-o://74fb37ff12f72c03a5f94367ab0f45dbbc7b48f6d463c3dd733000d61c96b4ba" gracePeriod=30 Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.454846 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jt9rj"] Feb 19 09:48:41 crc kubenswrapper[4873]: E0219 09:48:41.455077 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e767e90e-5146-4f1e-9f0b-5f5acb185429" containerName="extract-content" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.455113 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e767e90e-5146-4f1e-9f0b-5f5acb185429" containerName="extract-content" Feb 19 09:48:41 crc kubenswrapper[4873]: E0219 09:48:41.455128 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" 
containerName="extract-content" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.455138 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" containerName="extract-content" Feb 19 09:48:41 crc kubenswrapper[4873]: E0219 09:48:41.455149 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" containerName="registry-server" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.455157 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" containerName="registry-server" Feb 19 09:48:41 crc kubenswrapper[4873]: E0219 09:48:41.455172 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e767e90e-5146-4f1e-9f0b-5f5acb185429" containerName="registry-server" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.455180 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e767e90e-5146-4f1e-9f0b-5f5acb185429" containerName="registry-server" Feb 19 09:48:41 crc kubenswrapper[4873]: E0219 09:48:41.455190 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e52516d8-c410-4dbd-b41f-cbda11425b0e" containerName="extract-content" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.455198 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52516d8-c410-4dbd-b41f-cbda11425b0e" containerName="extract-content" Feb 19 09:48:41 crc kubenswrapper[4873]: E0219 09:48:41.455209 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e52516d8-c410-4dbd-b41f-cbda11425b0e" containerName="registry-server" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.455217 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52516d8-c410-4dbd-b41f-cbda11425b0e" containerName="registry-server" Feb 19 09:48:41 crc kubenswrapper[4873]: E0219 09:48:41.455231 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e767e90e-5146-4f1e-9f0b-5f5acb185429" 
containerName="extract-utilities" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.455240 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e767e90e-5146-4f1e-9f0b-5f5acb185429" containerName="extract-utilities" Feb 19 09:48:41 crc kubenswrapper[4873]: E0219 09:48:41.455259 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e52516d8-c410-4dbd-b41f-cbda11425b0e" containerName="extract-utilities" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.455267 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e52516d8-c410-4dbd-b41f-cbda11425b0e" containerName="extract-utilities" Feb 19 09:48:41 crc kubenswrapper[4873]: E0219 09:48:41.455279 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" containerName="extract-utilities" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.455287 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" containerName="extract-utilities" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.455399 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="e767e90e-5146-4f1e-9f0b-5f5acb185429" containerName="registry-server" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.455414 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="d152d3c6-e3c6-4255-95b5-eafe02557eb9" containerName="registry-server" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.455427 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="e52516d8-c410-4dbd-b41f-cbda11425b0e" containerName="registry-server" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.455855 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.457865 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjn8l"] Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.458454 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gjn8l" podUID="7423538a-949c-4995-bcf8-f2b6a2f8d914" containerName="registry-server" containerID="cri-o://f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0" gracePeriod=30 Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.462933 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jt9rj"] Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.636685 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1d58439b-31c6-44df-a32d-48f0fcb6a361-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jt9rj\" (UID: \"1d58439b-31c6-44df-a32d-48f0fcb6a361\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.636854 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vqs4\" (UniqueName: \"kubernetes.io/projected/1d58439b-31c6-44df-a32d-48f0fcb6a361-kube-api-access-2vqs4\") pod \"marketplace-operator-79b997595-jt9rj\" (UID: \"1d58439b-31c6-44df-a32d-48f0fcb6a361\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.637037 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/1d58439b-31c6-44df-a32d-48f0fcb6a361-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jt9rj\" (UID: \"1d58439b-31c6-44df-a32d-48f0fcb6a361\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" Feb 19 09:48:41 crc kubenswrapper[4873]: E0219 09:48:41.671723 4873 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0 is running failed: container process not found" containerID="f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0" cmd=["grpc_health_probe","-addr=:50051"] Feb 19 09:48:41 crc kubenswrapper[4873]: E0219 09:48:41.672156 4873 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0 is running failed: container process not found" containerID="f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0" cmd=["grpc_health_probe","-addr=:50051"] Feb 19 09:48:41 crc kubenswrapper[4873]: E0219 09:48:41.672515 4873 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0 is running failed: container process not found" containerID="f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0" cmd=["grpc_health_probe","-addr=:50051"] Feb 19 09:48:41 crc kubenswrapper[4873]: E0219 09:48:41.672543 4873 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-gjn8l" 
podUID="7423538a-949c-4995-bcf8-f2b6a2f8d914" containerName="registry-server" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.738196 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vqs4\" (UniqueName: \"kubernetes.io/projected/1d58439b-31c6-44df-a32d-48f0fcb6a361-kube-api-access-2vqs4\") pod \"marketplace-operator-79b997595-jt9rj\" (UID: \"1d58439b-31c6-44df-a32d-48f0fcb6a361\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.738285 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d58439b-31c6-44df-a32d-48f0fcb6a361-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jt9rj\" (UID: \"1d58439b-31c6-44df-a32d-48f0fcb6a361\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.738325 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1d58439b-31c6-44df-a32d-48f0fcb6a361-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jt9rj\" (UID: \"1d58439b-31c6-44df-a32d-48f0fcb6a361\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.739894 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d58439b-31c6-44df-a32d-48f0fcb6a361-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jt9rj\" (UID: \"1d58439b-31c6-44df-a32d-48f0fcb6a361\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.747050 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/1d58439b-31c6-44df-a32d-48f0fcb6a361-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jt9rj\" (UID: \"1d58439b-31c6-44df-a32d-48f0fcb6a361\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.762464 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vqs4\" (UniqueName: \"kubernetes.io/projected/1d58439b-31c6-44df-a32d-48f0fcb6a361-kube-api-access-2vqs4\") pod \"marketplace-operator-79b997595-jt9rj\" (UID: \"1d58439b-31c6-44df-a32d-48f0fcb6a361\") " pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.881799 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.889762 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.891071 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.901030 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:48:41 crc kubenswrapper[4873]: I0219 09:48:41.907224 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.041669 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t66n7\" (UniqueName: \"kubernetes.io/projected/f905b5ea-71df-4b1c-997c-d68766bcfcfe-kube-api-access-t66n7\") pod \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\" (UID: \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.041724 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f905b5ea-71df-4b1c-997c-d68766bcfcfe-marketplace-operator-metrics\") pod \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\" (UID: \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.041749 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7423538a-949c-4995-bcf8-f2b6a2f8d914-utilities\") pod \"7423538a-949c-4995-bcf8-f2b6a2f8d914\" (UID: \"7423538a-949c-4995-bcf8-f2b6a2f8d914\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.041772 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0954690a-09f0-4b1b-be57-db87e9304488-utilities\") pod \"0954690a-09f0-4b1b-be57-db87e9304488\" (UID: \"0954690a-09f0-4b1b-be57-db87e9304488\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.041792 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f905b5ea-71df-4b1c-997c-d68766bcfcfe-marketplace-trusted-ca\") pod \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\" (UID: \"f905b5ea-71df-4b1c-997c-d68766bcfcfe\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.041824 4873 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-catalog-content\") pod \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\" (UID: \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.041842 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0954690a-09f0-4b1b-be57-db87e9304488-catalog-content\") pod \"0954690a-09f0-4b1b-be57-db87e9304488\" (UID: \"0954690a-09f0-4b1b-be57-db87e9304488\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.041868 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4gtj\" (UniqueName: \"kubernetes.io/projected/0954690a-09f0-4b1b-be57-db87e9304488-kube-api-access-f4gtj\") pod \"0954690a-09f0-4b1b-be57-db87e9304488\" (UID: \"0954690a-09f0-4b1b-be57-db87e9304488\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.041897 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnz6w\" (UniqueName: \"kubernetes.io/projected/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-kube-api-access-vnz6w\") pod \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\" (UID: \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.041924 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gph4z\" (UniqueName: \"kubernetes.io/projected/7423538a-949c-4995-bcf8-f2b6a2f8d914-kube-api-access-gph4z\") pod \"7423538a-949c-4995-bcf8-f2b6a2f8d914\" (UID: \"7423538a-949c-4995-bcf8-f2b6a2f8d914\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.041970 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7423538a-949c-4995-bcf8-f2b6a2f8d914-catalog-content\") pod 
\"7423538a-949c-4995-bcf8-f2b6a2f8d914\" (UID: \"7423538a-949c-4995-bcf8-f2b6a2f8d914\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.042007 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-utilities\") pod \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\" (UID: \"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.045876 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f905b5ea-71df-4b1c-997c-d68766bcfcfe-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "f905b5ea-71df-4b1c-997c-d68766bcfcfe" (UID: "f905b5ea-71df-4b1c-997c-d68766bcfcfe"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.046228 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0954690a-09f0-4b1b-be57-db87e9304488-utilities" (OuterVolumeSpecName: "utilities") pod "0954690a-09f0-4b1b-be57-db87e9304488" (UID: "0954690a-09f0-4b1b-be57-db87e9304488"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.046496 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0954690a-09f0-4b1b-be57-db87e9304488-kube-api-access-f4gtj" (OuterVolumeSpecName: "kube-api-access-f4gtj") pod "0954690a-09f0-4b1b-be57-db87e9304488" (UID: "0954690a-09f0-4b1b-be57-db87e9304488"). InnerVolumeSpecName "kube-api-access-f4gtj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.046533 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-kube-api-access-vnz6w" (OuterVolumeSpecName: "kube-api-access-vnz6w") pod "9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" (UID: "9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3"). InnerVolumeSpecName "kube-api-access-vnz6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.046799 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f905b5ea-71df-4b1c-997c-d68766bcfcfe-kube-api-access-t66n7" (OuterVolumeSpecName: "kube-api-access-t66n7") pod "f905b5ea-71df-4b1c-997c-d68766bcfcfe" (UID: "f905b5ea-71df-4b1c-997c-d68766bcfcfe"). InnerVolumeSpecName "kube-api-access-t66n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.046938 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7423538a-949c-4995-bcf8-f2b6a2f8d914-kube-api-access-gph4z" (OuterVolumeSpecName: "kube-api-access-gph4z") pod "7423538a-949c-4995-bcf8-f2b6a2f8d914" (UID: "7423538a-949c-4995-bcf8-f2b6a2f8d914"). InnerVolumeSpecName "kube-api-access-gph4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.048983 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f905b5ea-71df-4b1c-997c-d68766bcfcfe-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "f905b5ea-71df-4b1c-997c-d68766bcfcfe" (UID: "f905b5ea-71df-4b1c-997c-d68766bcfcfe"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.050420 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-utilities" (OuterVolumeSpecName: "utilities") pod "9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" (UID: "9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.052914 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7423538a-949c-4995-bcf8-f2b6a2f8d914-utilities" (OuterVolumeSpecName: "utilities") pod "7423538a-949c-4995-bcf8-f2b6a2f8d914" (UID: "7423538a-949c-4995-bcf8-f2b6a2f8d914"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.091958 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0954690a-09f0-4b1b-be57-db87e9304488-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0954690a-09f0-4b1b-be57-db87e9304488" (UID: "0954690a-09f0-4b1b-be57-db87e9304488"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.119607 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" (UID: "9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.134835 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.144455 4873 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f905b5ea-71df-4b1c-997c-d68766bcfcfe-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.144477 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7423538a-949c-4995-bcf8-f2b6a2f8d914-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.144488 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0954690a-09f0-4b1b-be57-db87e9304488-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.144496 4873 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f905b5ea-71df-4b1c-997c-d68766bcfcfe-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.144505 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.144513 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0954690a-09f0-4b1b-be57-db87e9304488-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.144520 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4gtj\" (UniqueName: \"kubernetes.io/projected/0954690a-09f0-4b1b-be57-db87e9304488-kube-api-access-f4gtj\") on node \"crc\" DevicePath \"\"" Feb 
19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.144530 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnz6w\" (UniqueName: \"kubernetes.io/projected/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-kube-api-access-vnz6w\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.144540 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gph4z\" (UniqueName: \"kubernetes.io/projected/7423538a-949c-4995-bcf8-f2b6a2f8d914-kube-api-access-gph4z\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.144548 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.144556 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t66n7\" (UniqueName: \"kubernetes.io/projected/f905b5ea-71df-4b1c-997c-d68766bcfcfe-kube-api-access-t66n7\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.206775 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7423538a-949c-4995-bcf8-f2b6a2f8d914-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7423538a-949c-4995-bcf8-f2b6a2f8d914" (UID: "7423538a-949c-4995-bcf8-f2b6a2f8d914"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.245898 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d58373-fe5d-4afe-9da1-256843164ff4-catalog-content\") pod \"d5d58373-fe5d-4afe-9da1-256843164ff4\" (UID: \"d5d58373-fe5d-4afe-9da1-256843164ff4\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.246017 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d58373-fe5d-4afe-9da1-256843164ff4-utilities\") pod \"d5d58373-fe5d-4afe-9da1-256843164ff4\" (UID: \"d5d58373-fe5d-4afe-9da1-256843164ff4\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.246062 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sl6f\" (UniqueName: \"kubernetes.io/projected/d5d58373-fe5d-4afe-9da1-256843164ff4-kube-api-access-7sl6f\") pod \"d5d58373-fe5d-4afe-9da1-256843164ff4\" (UID: \"d5d58373-fe5d-4afe-9da1-256843164ff4\") " Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.246280 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7423538a-949c-4995-bcf8-f2b6a2f8d914-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.246984 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5d58373-fe5d-4afe-9da1-256843164ff4-utilities" (OuterVolumeSpecName: "utilities") pod "d5d58373-fe5d-4afe-9da1-256843164ff4" (UID: "d5d58373-fe5d-4afe-9da1-256843164ff4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.250262 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d58373-fe5d-4afe-9da1-256843164ff4-kube-api-access-7sl6f" (OuterVolumeSpecName: "kube-api-access-7sl6f") pod "d5d58373-fe5d-4afe-9da1-256843164ff4" (UID: "d5d58373-fe5d-4afe-9da1-256843164ff4"). InnerVolumeSpecName "kube-api-access-7sl6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.305518 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5d58373-fe5d-4afe-9da1-256843164ff4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5d58373-fe5d-4afe-9da1-256843164ff4" (UID: "d5d58373-fe5d-4afe-9da1-256843164ff4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.349013 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5d58373-fe5d-4afe-9da1-256843164ff4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.349405 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5d58373-fe5d-4afe-9da1-256843164ff4-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.349426 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sl6f\" (UniqueName: \"kubernetes.io/projected/d5d58373-fe5d-4afe-9da1-256843164ff4-kube-api-access-7sl6f\") on node \"crc\" DevicePath \"\"" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.350783 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jt9rj"] Feb 19 09:48:42 crc kubenswrapper[4873]: 
W0219 09:48:42.360271 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d58439b_31c6_44df_a32d_48f0fcb6a361.slice/crio-0986493cd56ce3102e31a035df4a0cedabeb9d20f536849198783f901982b1e6 WatchSource:0}: Error finding container 0986493cd56ce3102e31a035df4a0cedabeb9d20f536849198783f901982b1e6: Status 404 returned error can't find the container with id 0986493cd56ce3102e31a035df4a0cedabeb9d20f536849198783f901982b1e6 Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.367316 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" event={"ID":"1d58439b-31c6-44df-a32d-48f0fcb6a361","Type":"ContainerStarted","Data":"0986493cd56ce3102e31a035df4a0cedabeb9d20f536849198783f901982b1e6"} Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.369730 4873 generic.go:334] "Generic (PLEG): container finished" podID="d5d58373-fe5d-4afe-9da1-256843164ff4" containerID="f9966e12727ab7b65da136d3e9ae3ccc835716701fdc8c61d0ce85583d0d264b" exitCode=0 Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.369821 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jm66x" event={"ID":"d5d58373-fe5d-4afe-9da1-256843164ff4","Type":"ContainerDied","Data":"f9966e12727ab7b65da136d3e9ae3ccc835716701fdc8c61d0ce85583d0d264b"} Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.369879 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jm66x" event={"ID":"d5d58373-fe5d-4afe-9da1-256843164ff4","Type":"ContainerDied","Data":"81841dde96f0bd4c162af34dbaad80f9410851d99d1c9f71453d6997e197a658"} Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.369920 4873 scope.go:117] "RemoveContainer" containerID="f9966e12727ab7b65da136d3e9ae3ccc835716701fdc8c61d0ce85583d0d264b" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.370066 4873 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jm66x" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.373309 4873 generic.go:334] "Generic (PLEG): container finished" podID="0954690a-09f0-4b1b-be57-db87e9304488" containerID="74fb37ff12f72c03a5f94367ab0f45dbbc7b48f6d463c3dd733000d61c96b4ba" exitCode=0 Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.373382 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hv2j6" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.373399 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hv2j6" event={"ID":"0954690a-09f0-4b1b-be57-db87e9304488","Type":"ContainerDied","Data":"74fb37ff12f72c03a5f94367ab0f45dbbc7b48f6d463c3dd733000d61c96b4ba"} Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.373446 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hv2j6" event={"ID":"0954690a-09f0-4b1b-be57-db87e9304488","Type":"ContainerDied","Data":"edab2539f7fc8755b323d06c9cc87b6333d411f7bbacd04da485c28f244826a3"} Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.375568 4873 generic.go:334] "Generic (PLEG): container finished" podID="f905b5ea-71df-4b1c-997c-d68766bcfcfe" containerID="1c029703d8f2912597cb5a128fcbab53d9cfdb22e857dd41250cc1badc58b026" exitCode=0 Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.375669 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.375686 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" event={"ID":"f905b5ea-71df-4b1c-997c-d68766bcfcfe","Type":"ContainerDied","Data":"1c029703d8f2912597cb5a128fcbab53d9cfdb22e857dd41250cc1badc58b026"} Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.375868 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-86hhq" event={"ID":"f905b5ea-71df-4b1c-997c-d68766bcfcfe","Type":"ContainerDied","Data":"019acffae30ee36980fd8260d8a8299738a95c80eb08007a6c0478560261a038"} Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.378836 4873 generic.go:334] "Generic (PLEG): container finished" podID="7423538a-949c-4995-bcf8-f2b6a2f8d914" containerID="f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0" exitCode=0 Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.378955 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjn8l" event={"ID":"7423538a-949c-4995-bcf8-f2b6a2f8d914","Type":"ContainerDied","Data":"f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0"} Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.379034 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjn8l" event={"ID":"7423538a-949c-4995-bcf8-f2b6a2f8d914","Type":"ContainerDied","Data":"5bca84b1a6668c5e7d3c16b7d1810bc8d1542096d34580cd77564b1a69e0e7cc"} Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.379307 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gjn8l" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.383274 4873 generic.go:334] "Generic (PLEG): container finished" podID="9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" containerID="65f03008f264ea89ece1ee912ca47401c748ee7054983199b5bd318655ded9bf" exitCode=0 Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.383316 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnf24" event={"ID":"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3","Type":"ContainerDied","Data":"65f03008f264ea89ece1ee912ca47401c748ee7054983199b5bd318655ded9bf"} Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.383341 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnf24" event={"ID":"9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3","Type":"ContainerDied","Data":"49ebe6c3ea35eaecd163d7a7c155a22151d195a56ce773049fc5f4d9fdced9e7"} Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.383425 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tnf24" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.400549 4873 scope.go:117] "RemoveContainer" containerID="45818128c5d2d6d5769ffe637ae6ad9e378ad3fe2558a9d3cca64ab9d5a6861c" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.431644 4873 scope.go:117] "RemoveContainer" containerID="c2f5ae8579d6418292a6c0f3c975976d1e536a3be874c8a9279ae73eeb01983a" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.434858 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jm66x"] Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.440941 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jm66x"] Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.443456 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hv2j6"] Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.445672 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hv2j6"] Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.469261 4873 scope.go:117] "RemoveContainer" containerID="f9966e12727ab7b65da136d3e9ae3ccc835716701fdc8c61d0ce85583d0d264b" Feb 19 09:48:42 crc kubenswrapper[4873]: E0219 09:48:42.469732 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9966e12727ab7b65da136d3e9ae3ccc835716701fdc8c61d0ce85583d0d264b\": container with ID starting with f9966e12727ab7b65da136d3e9ae3ccc835716701fdc8c61d0ce85583d0d264b not found: ID does not exist" containerID="f9966e12727ab7b65da136d3e9ae3ccc835716701fdc8c61d0ce85583d0d264b" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.469786 4873 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f9966e12727ab7b65da136d3e9ae3ccc835716701fdc8c61d0ce85583d0d264b"} err="failed to get container status \"f9966e12727ab7b65da136d3e9ae3ccc835716701fdc8c61d0ce85583d0d264b\": rpc error: code = NotFound desc = could not find container \"f9966e12727ab7b65da136d3e9ae3ccc835716701fdc8c61d0ce85583d0d264b\": container with ID starting with f9966e12727ab7b65da136d3e9ae3ccc835716701fdc8c61d0ce85583d0d264b not found: ID does not exist" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.469819 4873 scope.go:117] "RemoveContainer" containerID="45818128c5d2d6d5769ffe637ae6ad9e378ad3fe2558a9d3cca64ab9d5a6861c" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.470266 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tnf24"] Feb 19 09:48:42 crc kubenswrapper[4873]: E0219 09:48:42.470381 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45818128c5d2d6d5769ffe637ae6ad9e378ad3fe2558a9d3cca64ab9d5a6861c\": container with ID starting with 45818128c5d2d6d5769ffe637ae6ad9e378ad3fe2558a9d3cca64ab9d5a6861c not found: ID does not exist" containerID="45818128c5d2d6d5769ffe637ae6ad9e378ad3fe2558a9d3cca64ab9d5a6861c" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.470409 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45818128c5d2d6d5769ffe637ae6ad9e378ad3fe2558a9d3cca64ab9d5a6861c"} err="failed to get container status \"45818128c5d2d6d5769ffe637ae6ad9e378ad3fe2558a9d3cca64ab9d5a6861c\": rpc error: code = NotFound desc = could not find container \"45818128c5d2d6d5769ffe637ae6ad9e378ad3fe2558a9d3cca64ab9d5a6861c\": container with ID starting with 45818128c5d2d6d5769ffe637ae6ad9e378ad3fe2558a9d3cca64ab9d5a6861c not found: ID does not exist" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.470434 4873 scope.go:117] "RemoveContainer" 
containerID="c2f5ae8579d6418292a6c0f3c975976d1e536a3be874c8a9279ae73eeb01983a" Feb 19 09:48:42 crc kubenswrapper[4873]: E0219 09:48:42.470925 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2f5ae8579d6418292a6c0f3c975976d1e536a3be874c8a9279ae73eeb01983a\": container with ID starting with c2f5ae8579d6418292a6c0f3c975976d1e536a3be874c8a9279ae73eeb01983a not found: ID does not exist" containerID="c2f5ae8579d6418292a6c0f3c975976d1e536a3be874c8a9279ae73eeb01983a" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.470967 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2f5ae8579d6418292a6c0f3c975976d1e536a3be874c8a9279ae73eeb01983a"} err="failed to get container status \"c2f5ae8579d6418292a6c0f3c975976d1e536a3be874c8a9279ae73eeb01983a\": rpc error: code = NotFound desc = could not find container \"c2f5ae8579d6418292a6c0f3c975976d1e536a3be874c8a9279ae73eeb01983a\": container with ID starting with c2f5ae8579d6418292a6c0f3c975976d1e536a3be874c8a9279ae73eeb01983a not found: ID does not exist" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.470991 4873 scope.go:117] "RemoveContainer" containerID="74fb37ff12f72c03a5f94367ab0f45dbbc7b48f6d463c3dd733000d61c96b4ba" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.473064 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tnf24"] Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.476744 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-86hhq"] Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.481131 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-86hhq"] Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.483610 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-gjn8l"] Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.485673 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gjn8l"] Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.488898 4873 scope.go:117] "RemoveContainer" containerID="6e4a6ad4fb008d881b5209c2ea81c21452449e9b024f3846af5079a4f4b5e1a9" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.499447 4873 scope.go:117] "RemoveContainer" containerID="8612f8546567d73e69164de7ce77990ef80b0c0484ccccf94d45b84576d5ac5b" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.512786 4873 scope.go:117] "RemoveContainer" containerID="74fb37ff12f72c03a5f94367ab0f45dbbc7b48f6d463c3dd733000d61c96b4ba" Feb 19 09:48:42 crc kubenswrapper[4873]: E0219 09:48:42.513344 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74fb37ff12f72c03a5f94367ab0f45dbbc7b48f6d463c3dd733000d61c96b4ba\": container with ID starting with 74fb37ff12f72c03a5f94367ab0f45dbbc7b48f6d463c3dd733000d61c96b4ba not found: ID does not exist" containerID="74fb37ff12f72c03a5f94367ab0f45dbbc7b48f6d463c3dd733000d61c96b4ba" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.513438 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74fb37ff12f72c03a5f94367ab0f45dbbc7b48f6d463c3dd733000d61c96b4ba"} err="failed to get container status \"74fb37ff12f72c03a5f94367ab0f45dbbc7b48f6d463c3dd733000d61c96b4ba\": rpc error: code = NotFound desc = could not find container \"74fb37ff12f72c03a5f94367ab0f45dbbc7b48f6d463c3dd733000d61c96b4ba\": container with ID starting with 74fb37ff12f72c03a5f94367ab0f45dbbc7b48f6d463c3dd733000d61c96b4ba not found: ID does not exist" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.513520 4873 scope.go:117] "RemoveContainer" containerID="6e4a6ad4fb008d881b5209c2ea81c21452449e9b024f3846af5079a4f4b5e1a9" 
Feb 19 09:48:42 crc kubenswrapper[4873]: E0219 09:48:42.513895 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e4a6ad4fb008d881b5209c2ea81c21452449e9b024f3846af5079a4f4b5e1a9\": container with ID starting with 6e4a6ad4fb008d881b5209c2ea81c21452449e9b024f3846af5079a4f4b5e1a9 not found: ID does not exist" containerID="6e4a6ad4fb008d881b5209c2ea81c21452449e9b024f3846af5079a4f4b5e1a9" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.513926 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4a6ad4fb008d881b5209c2ea81c21452449e9b024f3846af5079a4f4b5e1a9"} err="failed to get container status \"6e4a6ad4fb008d881b5209c2ea81c21452449e9b024f3846af5079a4f4b5e1a9\": rpc error: code = NotFound desc = could not find container \"6e4a6ad4fb008d881b5209c2ea81c21452449e9b024f3846af5079a4f4b5e1a9\": container with ID starting with 6e4a6ad4fb008d881b5209c2ea81c21452449e9b024f3846af5079a4f4b5e1a9 not found: ID does not exist" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.513948 4873 scope.go:117] "RemoveContainer" containerID="8612f8546567d73e69164de7ce77990ef80b0c0484ccccf94d45b84576d5ac5b" Feb 19 09:48:42 crc kubenswrapper[4873]: E0219 09:48:42.514933 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8612f8546567d73e69164de7ce77990ef80b0c0484ccccf94d45b84576d5ac5b\": container with ID starting with 8612f8546567d73e69164de7ce77990ef80b0c0484ccccf94d45b84576d5ac5b not found: ID does not exist" containerID="8612f8546567d73e69164de7ce77990ef80b0c0484ccccf94d45b84576d5ac5b" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.515014 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8612f8546567d73e69164de7ce77990ef80b0c0484ccccf94d45b84576d5ac5b"} err="failed to get container status 
\"8612f8546567d73e69164de7ce77990ef80b0c0484ccccf94d45b84576d5ac5b\": rpc error: code = NotFound desc = could not find container \"8612f8546567d73e69164de7ce77990ef80b0c0484ccccf94d45b84576d5ac5b\": container with ID starting with 8612f8546567d73e69164de7ce77990ef80b0c0484ccccf94d45b84576d5ac5b not found: ID does not exist" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.515080 4873 scope.go:117] "RemoveContainer" containerID="1c029703d8f2912597cb5a128fcbab53d9cfdb22e857dd41250cc1badc58b026" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.547611 4873 scope.go:117] "RemoveContainer" containerID="1c029703d8f2912597cb5a128fcbab53d9cfdb22e857dd41250cc1badc58b026" Feb 19 09:48:42 crc kubenswrapper[4873]: E0219 09:48:42.556774 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c029703d8f2912597cb5a128fcbab53d9cfdb22e857dd41250cc1badc58b026\": container with ID starting with 1c029703d8f2912597cb5a128fcbab53d9cfdb22e857dd41250cc1badc58b026 not found: ID does not exist" containerID="1c029703d8f2912597cb5a128fcbab53d9cfdb22e857dd41250cc1badc58b026" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.556825 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c029703d8f2912597cb5a128fcbab53d9cfdb22e857dd41250cc1badc58b026"} err="failed to get container status \"1c029703d8f2912597cb5a128fcbab53d9cfdb22e857dd41250cc1badc58b026\": rpc error: code = NotFound desc = could not find container \"1c029703d8f2912597cb5a128fcbab53d9cfdb22e857dd41250cc1badc58b026\": container with ID starting with 1c029703d8f2912597cb5a128fcbab53d9cfdb22e857dd41250cc1badc58b026 not found: ID does not exist" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.556853 4873 scope.go:117] "RemoveContainer" containerID="f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.592333 4873 
scope.go:117] "RemoveContainer" containerID="e509c49bf20ffebae8039816e8a5dbc8ef6b58ea2d9ef21e6d469e9a99d8fcf2" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.609382 4873 scope.go:117] "RemoveContainer" containerID="9629d2d8b3e9b42462cd22dea41c2d0741500af9092d6a0c14427b12d6440d6e" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.624533 4873 scope.go:117] "RemoveContainer" containerID="f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0" Feb 19 09:48:42 crc kubenswrapper[4873]: E0219 09:48:42.624972 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0\": container with ID starting with f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0 not found: ID does not exist" containerID="f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.625009 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0"} err="failed to get container status \"f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0\": rpc error: code = NotFound desc = could not find container \"f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0\": container with ID starting with f88d143b0e1a50c7403539950ca7222c7ff265725c9ef86d1f4fc96a10db2db0 not found: ID does not exist" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.625038 4873 scope.go:117] "RemoveContainer" containerID="e509c49bf20ffebae8039816e8a5dbc8ef6b58ea2d9ef21e6d469e9a99d8fcf2" Feb 19 09:48:42 crc kubenswrapper[4873]: E0219 09:48:42.625357 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e509c49bf20ffebae8039816e8a5dbc8ef6b58ea2d9ef21e6d469e9a99d8fcf2\": container with ID starting with 
e509c49bf20ffebae8039816e8a5dbc8ef6b58ea2d9ef21e6d469e9a99d8fcf2 not found: ID does not exist" containerID="e509c49bf20ffebae8039816e8a5dbc8ef6b58ea2d9ef21e6d469e9a99d8fcf2" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.625384 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e509c49bf20ffebae8039816e8a5dbc8ef6b58ea2d9ef21e6d469e9a99d8fcf2"} err="failed to get container status \"e509c49bf20ffebae8039816e8a5dbc8ef6b58ea2d9ef21e6d469e9a99d8fcf2\": rpc error: code = NotFound desc = could not find container \"e509c49bf20ffebae8039816e8a5dbc8ef6b58ea2d9ef21e6d469e9a99d8fcf2\": container with ID starting with e509c49bf20ffebae8039816e8a5dbc8ef6b58ea2d9ef21e6d469e9a99d8fcf2 not found: ID does not exist" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.625401 4873 scope.go:117] "RemoveContainer" containerID="9629d2d8b3e9b42462cd22dea41c2d0741500af9092d6a0c14427b12d6440d6e" Feb 19 09:48:42 crc kubenswrapper[4873]: E0219 09:48:42.625654 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9629d2d8b3e9b42462cd22dea41c2d0741500af9092d6a0c14427b12d6440d6e\": container with ID starting with 9629d2d8b3e9b42462cd22dea41c2d0741500af9092d6a0c14427b12d6440d6e not found: ID does not exist" containerID="9629d2d8b3e9b42462cd22dea41c2d0741500af9092d6a0c14427b12d6440d6e" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.625678 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9629d2d8b3e9b42462cd22dea41c2d0741500af9092d6a0c14427b12d6440d6e"} err="failed to get container status \"9629d2d8b3e9b42462cd22dea41c2d0741500af9092d6a0c14427b12d6440d6e\": rpc error: code = NotFound desc = could not find container \"9629d2d8b3e9b42462cd22dea41c2d0741500af9092d6a0c14427b12d6440d6e\": container with ID starting with 9629d2d8b3e9b42462cd22dea41c2d0741500af9092d6a0c14427b12d6440d6e not found: ID does not 
exist" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.625696 4873 scope.go:117] "RemoveContainer" containerID="65f03008f264ea89ece1ee912ca47401c748ee7054983199b5bd318655ded9bf" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.636646 4873 scope.go:117] "RemoveContainer" containerID="1479a6032ed4c9b61367740ab02f5b5f34a581176449601b359feaf96fa47f0a" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.650654 4873 scope.go:117] "RemoveContainer" containerID="ec81b30df0c0381b7218bf27c1e036ec8790d0ece7bfe1e31d1cbe023c4a6210" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.670346 4873 scope.go:117] "RemoveContainer" containerID="65f03008f264ea89ece1ee912ca47401c748ee7054983199b5bd318655ded9bf" Feb 19 09:48:42 crc kubenswrapper[4873]: E0219 09:48:42.674484 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65f03008f264ea89ece1ee912ca47401c748ee7054983199b5bd318655ded9bf\": container with ID starting with 65f03008f264ea89ece1ee912ca47401c748ee7054983199b5bd318655ded9bf not found: ID does not exist" containerID="65f03008f264ea89ece1ee912ca47401c748ee7054983199b5bd318655ded9bf" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.674520 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f03008f264ea89ece1ee912ca47401c748ee7054983199b5bd318655ded9bf"} err="failed to get container status \"65f03008f264ea89ece1ee912ca47401c748ee7054983199b5bd318655ded9bf\": rpc error: code = NotFound desc = could not find container \"65f03008f264ea89ece1ee912ca47401c748ee7054983199b5bd318655ded9bf\": container with ID starting with 65f03008f264ea89ece1ee912ca47401c748ee7054983199b5bd318655ded9bf not found: ID does not exist" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.674546 4873 scope.go:117] "RemoveContainer" containerID="1479a6032ed4c9b61367740ab02f5b5f34a581176449601b359feaf96fa47f0a" Feb 19 09:48:42 crc 
kubenswrapper[4873]: E0219 09:48:42.674890 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1479a6032ed4c9b61367740ab02f5b5f34a581176449601b359feaf96fa47f0a\": container with ID starting with 1479a6032ed4c9b61367740ab02f5b5f34a581176449601b359feaf96fa47f0a not found: ID does not exist" containerID="1479a6032ed4c9b61367740ab02f5b5f34a581176449601b359feaf96fa47f0a" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.674915 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1479a6032ed4c9b61367740ab02f5b5f34a581176449601b359feaf96fa47f0a"} err="failed to get container status \"1479a6032ed4c9b61367740ab02f5b5f34a581176449601b359feaf96fa47f0a\": rpc error: code = NotFound desc = could not find container \"1479a6032ed4c9b61367740ab02f5b5f34a581176449601b359feaf96fa47f0a\": container with ID starting with 1479a6032ed4c9b61367740ab02f5b5f34a581176449601b359feaf96fa47f0a not found: ID does not exist" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.674935 4873 scope.go:117] "RemoveContainer" containerID="ec81b30df0c0381b7218bf27c1e036ec8790d0ece7bfe1e31d1cbe023c4a6210" Feb 19 09:48:42 crc kubenswrapper[4873]: E0219 09:48:42.675365 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec81b30df0c0381b7218bf27c1e036ec8790d0ece7bfe1e31d1cbe023c4a6210\": container with ID starting with ec81b30df0c0381b7218bf27c1e036ec8790d0ece7bfe1e31d1cbe023c4a6210 not found: ID does not exist" containerID="ec81b30df0c0381b7218bf27c1e036ec8790d0ece7bfe1e31d1cbe023c4a6210" Feb 19 09:48:42 crc kubenswrapper[4873]: I0219 09:48:42.675396 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec81b30df0c0381b7218bf27c1e036ec8790d0ece7bfe1e31d1cbe023c4a6210"} err="failed to get container status 
\"ec81b30df0c0381b7218bf27c1e036ec8790d0ece7bfe1e31d1cbe023c4a6210\": rpc error: code = NotFound desc = could not find container \"ec81b30df0c0381b7218bf27c1e036ec8790d0ece7bfe1e31d1cbe023c4a6210\": container with ID starting with ec81b30df0c0381b7218bf27c1e036ec8790d0ece7bfe1e31d1cbe023c4a6210 not found: ID does not exist" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.398472 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" event={"ID":"1d58439b-31c6-44df-a32d-48f0fcb6a361","Type":"ContainerStarted","Data":"21df5c2a915a009ceef25d6825f5c78f78a2145abfb94dd37b528ab70fa879e1"} Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.398795 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.403309 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.422611 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jt9rj" podStartSLOduration=2.422594324 podStartE2EDuration="2.422594324s" podCreationTimestamp="2026-02-19 09:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:48:43.417956367 +0000 UTC m=+232.707388035" watchObservedRunningTime="2026-02-19 09:48:43.422594324 +0000 UTC m=+232.712025962" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.492948 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0954690a-09f0-4b1b-be57-db87e9304488" path="/var/lib/kubelet/pods/0954690a-09f0-4b1b-be57-db87e9304488/volumes" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.494184 4873 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="7423538a-949c-4995-bcf8-f2b6a2f8d914" path="/var/lib/kubelet/pods/7423538a-949c-4995-bcf8-f2b6a2f8d914/volumes" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.495450 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" path="/var/lib/kubelet/pods/9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3/volumes" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.496977 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5d58373-fe5d-4afe-9da1-256843164ff4" path="/var/lib/kubelet/pods/d5d58373-fe5d-4afe-9da1-256843164ff4/volumes" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.497839 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f905b5ea-71df-4b1c-997c-d68766bcfcfe" path="/var/lib/kubelet/pods/f905b5ea-71df-4b1c-997c-d68766bcfcfe/volumes" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.623343 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c2d4s"] Feb 19 09:48:43 crc kubenswrapper[4873]: E0219 09:48:43.623618 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7423538a-949c-4995-bcf8-f2b6a2f8d914" containerName="extract-utilities" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.623646 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="7423538a-949c-4995-bcf8-f2b6a2f8d914" containerName="extract-utilities" Feb 19 09:48:43 crc kubenswrapper[4873]: E0219 09:48:43.623671 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7423538a-949c-4995-bcf8-f2b6a2f8d914" containerName="registry-server" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.623684 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="7423538a-949c-4995-bcf8-f2b6a2f8d914" containerName="registry-server" Feb 19 09:48:43 crc kubenswrapper[4873]: E0219 09:48:43.623698 4873 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" containerName="extract-content" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.623710 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" containerName="extract-content" Feb 19 09:48:43 crc kubenswrapper[4873]: E0219 09:48:43.623726 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" containerName="registry-server" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.623737 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" containerName="registry-server" Feb 19 09:48:43 crc kubenswrapper[4873]: E0219 09:48:43.623751 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0954690a-09f0-4b1b-be57-db87e9304488" containerName="extract-content" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.623762 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0954690a-09f0-4b1b-be57-db87e9304488" containerName="extract-content" Feb 19 09:48:43 crc kubenswrapper[4873]: E0219 09:48:43.623775 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" containerName="extract-utilities" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.623785 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" containerName="extract-utilities" Feb 19 09:48:43 crc kubenswrapper[4873]: E0219 09:48:43.623798 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7423538a-949c-4995-bcf8-f2b6a2f8d914" containerName="extract-content" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.623809 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="7423538a-949c-4995-bcf8-f2b6a2f8d914" containerName="extract-content" Feb 19 09:48:43 crc kubenswrapper[4873]: E0219 09:48:43.623825 4873 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d5d58373-fe5d-4afe-9da1-256843164ff4" containerName="extract-utilities" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.623837 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d58373-fe5d-4afe-9da1-256843164ff4" containerName="extract-utilities" Feb 19 09:48:43 crc kubenswrapper[4873]: E0219 09:48:43.623855 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d58373-fe5d-4afe-9da1-256843164ff4" containerName="extract-content" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.623866 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d58373-fe5d-4afe-9da1-256843164ff4" containerName="extract-content" Feb 19 09:48:43 crc kubenswrapper[4873]: E0219 09:48:43.623879 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0954690a-09f0-4b1b-be57-db87e9304488" containerName="extract-utilities" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.623890 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0954690a-09f0-4b1b-be57-db87e9304488" containerName="extract-utilities" Feb 19 09:48:43 crc kubenswrapper[4873]: E0219 09:48:43.623904 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0954690a-09f0-4b1b-be57-db87e9304488" containerName="registry-server" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.623915 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0954690a-09f0-4b1b-be57-db87e9304488" containerName="registry-server" Feb 19 09:48:43 crc kubenswrapper[4873]: E0219 09:48:43.623929 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d58373-fe5d-4afe-9da1-256843164ff4" containerName="registry-server" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.623940 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d58373-fe5d-4afe-9da1-256843164ff4" containerName="registry-server" Feb 19 09:48:43 crc kubenswrapper[4873]: E0219 09:48:43.623952 4873 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f905b5ea-71df-4b1c-997c-d68766bcfcfe" containerName="marketplace-operator" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.623963 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f905b5ea-71df-4b1c-997c-d68766bcfcfe" containerName="marketplace-operator" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.624125 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f905b5ea-71df-4b1c-997c-d68766bcfcfe" containerName="marketplace-operator" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.624154 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="0954690a-09f0-4b1b-be57-db87e9304488" containerName="registry-server" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.624168 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d58373-fe5d-4afe-9da1-256843164ff4" containerName="registry-server" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.624180 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="7423538a-949c-4995-bcf8-f2b6a2f8d914" containerName="registry-server" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.624196 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f9a2892-8bf0-451c-b46c-eac8a5fe1ce3" containerName="registry-server" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.625216 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.627399 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.640770 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c2d4s"] Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.774700 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8j57\" (UniqueName: \"kubernetes.io/projected/92377803-fb7e-42d1-ba93-54235a8f9409-kube-api-access-g8j57\") pod \"certified-operators-c2d4s\" (UID: \"92377803-fb7e-42d1-ba93-54235a8f9409\") " pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.774774 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92377803-fb7e-42d1-ba93-54235a8f9409-utilities\") pod \"certified-operators-c2d4s\" (UID: \"92377803-fb7e-42d1-ba93-54235a8f9409\") " pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.774822 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92377803-fb7e-42d1-ba93-54235a8f9409-catalog-content\") pod \"certified-operators-c2d4s\" (UID: \"92377803-fb7e-42d1-ba93-54235a8f9409\") " pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.830213 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xvshp"] Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.832891 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.834932 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvshp"] Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.835350 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.876007 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8j57\" (UniqueName: \"kubernetes.io/projected/92377803-fb7e-42d1-ba93-54235a8f9409-kube-api-access-g8j57\") pod \"certified-operators-c2d4s\" (UID: \"92377803-fb7e-42d1-ba93-54235a8f9409\") " pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.876140 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92377803-fb7e-42d1-ba93-54235a8f9409-utilities\") pod \"certified-operators-c2d4s\" (UID: \"92377803-fb7e-42d1-ba93-54235a8f9409\") " pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.876212 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92377803-fb7e-42d1-ba93-54235a8f9409-catalog-content\") pod \"certified-operators-c2d4s\" (UID: \"92377803-fb7e-42d1-ba93-54235a8f9409\") " pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.876801 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92377803-fb7e-42d1-ba93-54235a8f9409-utilities\") pod \"certified-operators-c2d4s\" (UID: \"92377803-fb7e-42d1-ba93-54235a8f9409\") " 
pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.878387 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92377803-fb7e-42d1-ba93-54235a8f9409-catalog-content\") pod \"certified-operators-c2d4s\" (UID: \"92377803-fb7e-42d1-ba93-54235a8f9409\") " pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.894477 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8j57\" (UniqueName: \"kubernetes.io/projected/92377803-fb7e-42d1-ba93-54235a8f9409-kube-api-access-g8j57\") pod \"certified-operators-c2d4s\" (UID: \"92377803-fb7e-42d1-ba93-54235a8f9409\") " pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.946125 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.976883 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a9b521-3ed0-40c1-b38f-34c21bd9c242-utilities\") pod \"redhat-marketplace-xvshp\" (UID: \"f9a9b521-3ed0-40c1-b38f-34c21bd9c242\") " pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.977054 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a9b521-3ed0-40c1-b38f-34c21bd9c242-catalog-content\") pod \"redhat-marketplace-xvshp\" (UID: \"f9a9b521-3ed0-40c1-b38f-34c21bd9c242\") " pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:43 crc kubenswrapper[4873]: I0219 09:48:43.977200 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hnjw8\" (UniqueName: \"kubernetes.io/projected/f9a9b521-3ed0-40c1-b38f-34c21bd9c242-kube-api-access-hnjw8\") pod \"redhat-marketplace-xvshp\" (UID: \"f9a9b521-3ed0-40c1-b38f-34c21bd9c242\") " pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:44 crc kubenswrapper[4873]: I0219 09:48:44.078773 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a9b521-3ed0-40c1-b38f-34c21bd9c242-catalog-content\") pod \"redhat-marketplace-xvshp\" (UID: \"f9a9b521-3ed0-40c1-b38f-34c21bd9c242\") " pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:44 crc kubenswrapper[4873]: I0219 09:48:44.079117 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnjw8\" (UniqueName: \"kubernetes.io/projected/f9a9b521-3ed0-40c1-b38f-34c21bd9c242-kube-api-access-hnjw8\") pod \"redhat-marketplace-xvshp\" (UID: \"f9a9b521-3ed0-40c1-b38f-34c21bd9c242\") " pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:44 crc kubenswrapper[4873]: I0219 09:48:44.079255 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a9b521-3ed0-40c1-b38f-34c21bd9c242-utilities\") pod \"redhat-marketplace-xvshp\" (UID: \"f9a9b521-3ed0-40c1-b38f-34c21bd9c242\") " pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:44 crc kubenswrapper[4873]: I0219 09:48:44.079760 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a9b521-3ed0-40c1-b38f-34c21bd9c242-utilities\") pod \"redhat-marketplace-xvshp\" (UID: \"f9a9b521-3ed0-40c1-b38f-34c21bd9c242\") " pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:44 crc kubenswrapper[4873]: I0219 09:48:44.081671 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/f9a9b521-3ed0-40c1-b38f-34c21bd9c242-catalog-content\") pod \"redhat-marketplace-xvshp\" (UID: \"f9a9b521-3ed0-40c1-b38f-34c21bd9c242\") " pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:44 crc kubenswrapper[4873]: I0219 09:48:44.097524 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnjw8\" (UniqueName: \"kubernetes.io/projected/f9a9b521-3ed0-40c1-b38f-34c21bd9c242-kube-api-access-hnjw8\") pod \"redhat-marketplace-xvshp\" (UID: \"f9a9b521-3ed0-40c1-b38f-34c21bd9c242\") " pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:44 crc kubenswrapper[4873]: I0219 09:48:44.152632 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:44 crc kubenswrapper[4873]: I0219 09:48:44.377755 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c2d4s"] Feb 19 09:48:44 crc kubenswrapper[4873]: W0219 09:48:44.384810 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92377803_fb7e_42d1_ba93_54235a8f9409.slice/crio-a08dd4d7e39597a08d46eee691df8d3e8119bb68e610a407ed93ede91eb7581e WatchSource:0}: Error finding container a08dd4d7e39597a08d46eee691df8d3e8119bb68e610a407ed93ede91eb7581e: Status 404 returned error can't find the container with id a08dd4d7e39597a08d46eee691df8d3e8119bb68e610a407ed93ede91eb7581e Feb 19 09:48:44 crc kubenswrapper[4873]: I0219 09:48:44.409683 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2d4s" event={"ID":"92377803-fb7e-42d1-ba93-54235a8f9409","Type":"ContainerStarted","Data":"a08dd4d7e39597a08d46eee691df8d3e8119bb68e610a407ed93ede91eb7581e"} Feb 19 09:48:44 crc kubenswrapper[4873]: I0219 09:48:44.521349 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-xvshp"] Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.419780 4873 generic.go:334] "Generic (PLEG): container finished" podID="92377803-fb7e-42d1-ba93-54235a8f9409" containerID="09d79d6a44f7ab2bd840bebd67e2c4ff2d2bdef097d6feda8d881b4512fee593" exitCode=0 Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.419833 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2d4s" event={"ID":"92377803-fb7e-42d1-ba93-54235a8f9409","Type":"ContainerDied","Data":"09d79d6a44f7ab2bd840bebd67e2c4ff2d2bdef097d6feda8d881b4512fee593"} Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.423650 4873 generic.go:334] "Generic (PLEG): container finished" podID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" containerID="5ac66b7adc65a8e969f9606c8d6fe67c07864141a7835a55fb84b7ab451b4eaf" exitCode=0 Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.423723 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvshp" event={"ID":"f9a9b521-3ed0-40c1-b38f-34c21bd9c242","Type":"ContainerDied","Data":"5ac66b7adc65a8e969f9606c8d6fe67c07864141a7835a55fb84b7ab451b4eaf"} Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.423762 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvshp" event={"ID":"f9a9b521-3ed0-40c1-b38f-34c21bd9c242","Type":"ContainerStarted","Data":"2b8f61d8cd4134ecf14385216fe4ce55e1fb1ff7e28906f0dc379ffee35e93de"} Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.696227 4873 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.696808 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
containerID="cri-o://cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117" gracePeriod=15 Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.696864 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3" gracePeriod=15 Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.696937 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11" gracePeriod=15 Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.696895 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e" gracePeriod=15 Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.696919 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906" gracePeriod=15 Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.698679 4873 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 09:48:45 crc kubenswrapper[4873]: E0219 09:48:45.698905 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.698919 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 09:48:45 crc kubenswrapper[4873]: E0219 09:48:45.698929 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.698937 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 09:48:45 crc kubenswrapper[4873]: E0219 09:48:45.698951 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.698958 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 09:48:45 crc kubenswrapper[4873]: E0219 09:48:45.698969 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.698977 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 09:48:45 crc kubenswrapper[4873]: E0219 09:48:45.698986 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.698992 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 09:48:45 crc kubenswrapper[4873]: E0219 09:48:45.699003 4873 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.699009 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 09:48:45 crc kubenswrapper[4873]: E0219 09:48:45.699028 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.699037 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.699172 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.699185 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.699195 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.699203 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.699216 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.699231 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 09:48:45 crc 
kubenswrapper[4873]: I0219 09:48:45.699243 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 19 09:48:45 crc kubenswrapper[4873]: E0219 09:48:45.699345 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.699355 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.700568 4873 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.704641 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.712678 4873 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13"
Feb 19 09:48:45 crc kubenswrapper[4873]: E0219 09:48:45.770287 4873 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.156:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.799873 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.799915 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.799931 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.799948 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.799973 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.800035 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.800077 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.800156 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901069 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901284 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901200 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901347 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901386 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901428 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901484 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901556 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901609 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901656 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901700 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901816 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901882 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901927 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.901969 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: I0219 09:48:45.902010 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:48:45 crc kubenswrapper[4873]: E0219 09:48:45.972806 4873 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.156:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-xvshp.18959ce2b614bbec openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-xvshp,UID:f9a9b521-3ed0-40c1-b38f-34c21bd9c242,APIVersion:v1,ResourceVersion:29638,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\" in 545ms (545ms including waiting). Image size: 1202767548 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 09:48:45.9715123 +0000 UTC m=+235.260943958,LastTimestamp:2026-02-19 09:48:45.9715123 +0000 UTC m=+235.260943958,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.071240 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:46 crc kubenswrapper[4873]: W0219 09:48:46.086038 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-efad096894788afb5a1c67b77f3e0f83ab20ad0047d03e3c2dac5a8b464b7df4 WatchSource:0}: Error finding container efad096894788afb5a1c67b77f3e0f83ab20ad0047d03e3c2dac5a8b464b7df4: Status 404 returned error can't find the container with id efad096894788afb5a1c67b77f3e0f83ab20ad0047d03e3c2dac5a8b464b7df4
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.429372 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1cd15f918dfe991d39aa6c783668167ec9ec210784e46772d7b52f35e68404c5"}
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.429686 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"efad096894788afb5a1c67b77f3e0f83ab20ad0047d03e3c2dac5a8b464b7df4"}
Feb 19 09:48:46 crc kubenswrapper[4873]: E0219 09:48:46.430350 4873 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.156:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.431419 4873 generic.go:334] "Generic (PLEG): container finished" podID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" containerID="15c977fb8314904bd6e91aa0233a7a89db6c5ac04c1a6328daf2247a377c9f30" exitCode=0
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.431472 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvshp" event={"ID":"f9a9b521-3ed0-40c1-b38f-34c21bd9c242","Type":"ContainerDied","Data":"15c977fb8314904bd6e91aa0233a7a89db6c5ac04c1a6328daf2247a377c9f30"}
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.432047 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.433375 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.436532 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.437385 4873 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3" exitCode=0
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.437482 4873 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11" exitCode=0
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.437542 4873 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e" exitCode=0
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.437596 4873 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906" exitCode=2
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.437708 4873 scope.go:117] "RemoveContainer" containerID="7b5c2e53f92fb4360b78c1a61cdf6f27c617ddd4e99e6a2e079c7f4f3bcc1e90"
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.439682 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2d4s" event={"ID":"92377803-fb7e-42d1-ba93-54235a8f9409","Type":"ContainerStarted","Data":"5485a42ca3dc7f78b9a621c9cc2138b955a8e9f0ec19cb28fe151e0c8ae3a5b2"}
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.440365 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.440733 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.441691 4873 generic.go:334] "Generic (PLEG): container finished" podID="dcd45a6e-fa80-4995-bab8-20796784d618" containerID="12e07b634f8034e56f9833d14110782d34f2365b31aa7149ce239d933850da51" exitCode=0
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.441730 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dcd45a6e-fa80-4995-bab8-20796784d618","Type":"ContainerDied","Data":"12e07b634f8034e56f9833d14110782d34f2365b31aa7149ce239d933850da51"}
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.442268 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.442638 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:46 crc kubenswrapper[4873]: I0219 09:48:46.442897 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.449950 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvshp" event={"ID":"f9a9b521-3ed0-40c1-b38f-34c21bd9c242","Type":"ContainerStarted","Data":"3ffce5a7300dbbb711c51003ccc1526dc7263702b970e7431a505507cdc556c3"}
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.451007 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.451398 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.451721 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.454644 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.465225 4873 generic.go:334] "Generic (PLEG): container finished" podID="92377803-fb7e-42d1-ba93-54235a8f9409" containerID="5485a42ca3dc7f78b9a621c9cc2138b955a8e9f0ec19cb28fe151e0c8ae3a5b2" exitCode=0
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.465561 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2d4s" event={"ID":"92377803-fb7e-42d1-ba93-54235a8f9409","Type":"ContainerDied","Data":"5485a42ca3dc7f78b9a621c9cc2138b955a8e9f0ec19cb28fe151e0c8ae3a5b2"}
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.466158 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.466358 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.466552 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.701963 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.702919 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.703679 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.704094 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.825306 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcd45a6e-fa80-4995-bab8-20796784d618-kubelet-dir\") pod \"dcd45a6e-fa80-4995-bab8-20796784d618\" (UID: \"dcd45a6e-fa80-4995-bab8-20796784d618\") "
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.825642 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd45a6e-fa80-4995-bab8-20796784d618-kube-api-access\") pod \"dcd45a6e-fa80-4995-bab8-20796784d618\" (UID: \"dcd45a6e-fa80-4995-bab8-20796784d618\") "
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.825708 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dcd45a6e-fa80-4995-bab8-20796784d618-var-lock\") pod \"dcd45a6e-fa80-4995-bab8-20796784d618\" (UID: \"dcd45a6e-fa80-4995-bab8-20796784d618\") "
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.825419 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dcd45a6e-fa80-4995-bab8-20796784d618-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dcd45a6e-fa80-4995-bab8-20796784d618" (UID: "dcd45a6e-fa80-4995-bab8-20796784d618"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.825885 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dcd45a6e-fa80-4995-bab8-20796784d618-var-lock" (OuterVolumeSpecName: "var-lock") pod "dcd45a6e-fa80-4995-bab8-20796784d618" (UID: "dcd45a6e-fa80-4995-bab8-20796784d618"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.825937 4873 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcd45a6e-fa80-4995-bab8-20796784d618-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.846484 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd45a6e-fa80-4995-bab8-20796784d618-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dcd45a6e-fa80-4995-bab8-20796784d618" (UID: "dcd45a6e-fa80-4995-bab8-20796784d618"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.926604 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd45a6e-fa80-4995-bab8-20796784d618-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:47 crc kubenswrapper[4873]: I0219 09:48:47.926637 4873 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dcd45a6e-fa80-4995-bab8-20796784d618-var-lock\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.116976 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.117999 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.118754 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.119163 4873 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.119577 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.119801 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.229175 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.229227 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.229310 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.229561 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.229618 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.229650 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.331026 4873 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.331062 4873 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.331072 4873 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.473533 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.474260 4873 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117" exitCode=0
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.474339 4873 scope.go:117] "RemoveContainer" containerID="aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.474369 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.477201 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2d4s" event={"ID":"92377803-fb7e-42d1-ba93-54235a8f9409","Type":"ContainerStarted","Data":"9dcea95636472a4fa99dca5f5648382f340d5a35696bba2dd95d5d4baa05cb5a"}
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.477898 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.478282 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.478608 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dcd45a6e-fa80-4995-bab8-20796784d618","Type":"ContainerDied","Data":"3520c7c7b57ec4d387d5a52af5a8868db2183fc59373efaee14a1f7ca6894456"}
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.478665 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3520c7c7b57ec4d387d5a52af5a8868db2183fc59373efaee14a1f7ca6894456"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.478747 4873 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.479153 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.479196 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.488675 4873 scope.go:117] "RemoveContainer" containerID="b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.488715 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.488913 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.489114 4873 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.489415 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.499092 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.499334 4873 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.499546 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.499747 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.500146 4873 scope.go:117] "RemoveContainer" containerID="f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.522697 4873 scope.go:117] "RemoveContainer" containerID="982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.539801 4873 scope.go:117] "RemoveContainer" containerID="cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.572453 4873 scope.go:117] "RemoveContainer" containerID="0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.598687 4873 scope.go:117] "RemoveContainer" containerID="aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3"
Feb 19 09:48:48 crc kubenswrapper[4873]: E0219 09:48:48.599177 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\": container with ID starting with aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3 not found: ID does not exist" containerID="aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3"
Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.599204 4873
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3"} err="failed to get container status \"aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\": rpc error: code = NotFound desc = could not find container \"aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3\": container with ID starting with aa7017087969fb77ee99eae01d8a0baf7383b98a6f09d835a7d8937ade8a1ca3 not found: ID does not exist" Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.599223 4873 scope.go:117] "RemoveContainer" containerID="b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11" Feb 19 09:48:48 crc kubenswrapper[4873]: E0219 09:48:48.605844 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\": container with ID starting with b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11 not found: ID does not exist" containerID="b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11" Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.605902 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11"} err="failed to get container status \"b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\": rpc error: code = NotFound desc = could not find container \"b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11\": container with ID starting with b13498b714daa09f61bc71b622ad5fec9253b4cf67b602fc089424bb79794e11 not found: ID does not exist" Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.605937 4873 scope.go:117] "RemoveContainer" containerID="f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e" Feb 19 09:48:48 crc kubenswrapper[4873]: E0219 
09:48:48.606394 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\": container with ID starting with f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e not found: ID does not exist" containerID="f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e" Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.606429 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e"} err="failed to get container status \"f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\": rpc error: code = NotFound desc = could not find container \"f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e\": container with ID starting with f7bacb25129618d5756053587a398d7e6b60b1ebd6d218a88d3780851edf974e not found: ID does not exist" Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.606450 4873 scope.go:117] "RemoveContainer" containerID="982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906" Feb 19 09:48:48 crc kubenswrapper[4873]: E0219 09:48:48.606709 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\": container with ID starting with 982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906 not found: ID does not exist" containerID="982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906" Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.606738 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906"} err="failed to get container status \"982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\": rpc 
error: code = NotFound desc = could not find container \"982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906\": container with ID starting with 982ba46d11a4459c720ef1c2ac6668e88b58b96dca42d0085e4ae42b7a9c0906 not found: ID does not exist" Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.606755 4873 scope.go:117] "RemoveContainer" containerID="cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117" Feb 19 09:48:48 crc kubenswrapper[4873]: E0219 09:48:48.607014 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\": container with ID starting with cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117 not found: ID does not exist" containerID="cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117" Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.607051 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117"} err="failed to get container status \"cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\": rpc error: code = NotFound desc = could not find container \"cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117\": container with ID starting with cef5813f7cb2a1b36ecca7bb340e35f71b0b1a53edb19cd9bafda79e362ab117 not found: ID does not exist" Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.607095 4873 scope.go:117] "RemoveContainer" containerID="0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279" Feb 19 09:48:48 crc kubenswrapper[4873]: E0219 09:48:48.607609 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\": container with ID starting with 
0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279 not found: ID does not exist" containerID="0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279" Feb 19 09:48:48 crc kubenswrapper[4873]: I0219 09:48:48.607647 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279"} err="failed to get container status \"0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\": rpc error: code = NotFound desc = could not find container \"0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279\": container with ID starting with 0dad8f7655015996610223e9351f82201d43903a0c017a8be9a56240bc160279 not found: ID does not exist" Feb 19 09:48:49 crc kubenswrapper[4873]: I0219 09:48:49.490434 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 19 09:48:50 crc kubenswrapper[4873]: E0219 09:48:50.715542 4873 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:50 crc kubenswrapper[4873]: E0219 09:48:50.716366 4873 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:50 crc kubenswrapper[4873]: E0219 09:48:50.716736 4873 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:50 crc kubenswrapper[4873]: E0219 09:48:50.716955 4873 controller.go:195] "Failed to update 
lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:50 crc kubenswrapper[4873]: E0219 09:48:50.717179 4873 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:50 crc kubenswrapper[4873]: I0219 09:48:50.717209 4873 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 19 09:48:50 crc kubenswrapper[4873]: E0219 09:48:50.717445 4873 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" interval="200ms" Feb 19 09:48:50 crc kubenswrapper[4873]: E0219 09:48:50.918057 4873 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" interval="400ms" Feb 19 09:48:51 crc kubenswrapper[4873]: E0219 09:48:51.318899 4873 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" interval="800ms" Feb 19 09:48:51 crc kubenswrapper[4873]: I0219 09:48:51.486433 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:51 crc kubenswrapper[4873]: I0219 09:48:51.486815 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:51 crc kubenswrapper[4873]: I0219 09:48:51.487055 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:52 crc kubenswrapper[4873]: E0219 09:48:52.120516 4873 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" interval="1.6s" Feb 19 09:48:52 crc kubenswrapper[4873]: E0219 09:48:52.205986 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:48:52Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:48:52Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:48:52Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T09:48:52Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:52 crc kubenswrapper[4873]: E0219 09:48:52.206548 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:52 crc kubenswrapper[4873]: E0219 09:48:52.206975 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:52 crc kubenswrapper[4873]: E0219 09:48:52.207364 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 
09:48:52 crc kubenswrapper[4873]: E0219 09:48:52.207705 4873 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:52 crc kubenswrapper[4873]: E0219 09:48:52.207732 4873 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 09:48:52 crc kubenswrapper[4873]: E0219 09:48:52.517951 4873 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.156:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-xvshp.18959ce2b614bbec openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-xvshp,UID:f9a9b521-3ed0-40c1-b38f-34c21bd9c242,APIVersion:v1,ResourceVersion:29638,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\" in 545ms (545ms including waiting). 
Image size: 1202767548 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 09:48:45.9715123 +0000 UTC m=+235.260943958,LastTimestamp:2026-02-19 09:48:45.9715123 +0000 UTC m=+235.260943958,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 09:48:53 crc kubenswrapper[4873]: E0219 09:48:53.724845 4873 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" interval="3.2s" Feb 19 09:48:53 crc kubenswrapper[4873]: I0219 09:48:53.946393 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:53 crc kubenswrapper[4873]: I0219 09:48:53.946476 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.021979 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.022614 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.023485 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 
38.102.83.156:6443: connect: connection refused" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.024408 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.153332 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.153397 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.192654 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.193888 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.194416 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.194875 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" 
pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.561178 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.562385 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.563041 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.563577 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.571317 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xvshp" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.572060 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" 
pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.572649 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:54 crc kubenswrapper[4873]: I0219 09:48:54.573088 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:56 crc kubenswrapper[4873]: E0219 09:48:56.926833 4873 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.156:6443: connect: connection refused" interval="6.4s" Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.483544 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.484714 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.485388 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.485751 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused" Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.500644 4873 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d6df925a-1654-4ade-a300-97c316b0867f" Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.500691 4873 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d6df925a-1654-4ade-a300-97c316b0867f" Feb 19 09:48:59 crc kubenswrapper[4873]: E0219 09:48:59.501377 4873 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.156:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.502079 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 09:48:59 crc kubenswrapper[4873]: W0219 09:48:59.532979 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-34b4fa3b0b6b818a8178957453a86635ecce5a1a6de8a875bd50c7fc86caf429 WatchSource:0}: Error finding container 34b4fa3b0b6b818a8178957453a86635ecce5a1a6de8a875bd50c7fc86caf429: Status 404 returned error can't find the container with id 34b4fa3b0b6b818a8178957453a86635ecce5a1a6de8a875bd50c7fc86caf429 Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.557255 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.557345 4873 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850" exitCode=1 Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.557424 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850"} Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.557980 4873 scope.go:117] "RemoveContainer" containerID="c9e9d7320b0e9d2d241675d65d266b0cffe1c60ea0a221261787b340ca04d850" Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.559139 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"34b4fa3b0b6b818a8178957453a86635ecce5a1a6de8a875bd50c7fc86caf429"}
Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.559212 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.559570 4873 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.560006 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:48:59 crc kubenswrapper[4873]: I0219 09:48:59.560343 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:49:00 crc kubenswrapper[4873]: I0219 09:49:00.566658 4873 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="916aafadcf66dd970321f181045d851463d2f96fb391a8b557bae5ea0786a4ab" exitCode=0
Feb 19 09:49:00 crc kubenswrapper[4873]: I0219 09:49:00.566743 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"916aafadcf66dd970321f181045d851463d2f96fb391a8b557bae5ea0786a4ab"}
Feb 19 09:49:00 crc kubenswrapper[4873]: I0219 09:49:00.567233 4873 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d6df925a-1654-4ade-a300-97c316b0867f"
Feb 19 09:49:00 crc kubenswrapper[4873]: I0219 09:49:00.567281 4873 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d6df925a-1654-4ade-a300-97c316b0867f"
Feb 19 09:49:00 crc kubenswrapper[4873]: I0219 09:49:00.567781 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:49:00 crc kubenswrapper[4873]: E0219 09:49:00.567920 4873 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.156:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:49:00 crc kubenswrapper[4873]: I0219 09:49:00.568381 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:49:00 crc kubenswrapper[4873]: I0219 09:49:00.568647 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:49:00 crc kubenswrapper[4873]: I0219 09:49:00.568969 4873 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:49:00 crc kubenswrapper[4873]: I0219 09:49:00.572772 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 19 09:49:00 crc kubenswrapper[4873]: I0219 09:49:00.572866 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"af51a971d2d0522bb9002b49132470bda34bb1a02ca60cfc82d1929f48cb112d"}
Feb 19 09:49:00 crc kubenswrapper[4873]: I0219 09:49:00.573755 4873 status_manager.go:851] "Failed to get status for pod" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" pod="openshift-marketplace/certified-operators-c2d4s" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c2d4s\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:49:00 crc kubenswrapper[4873]: I0219 09:49:00.574173 4873 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:49:00 crc kubenswrapper[4873]: I0219 09:49:00.574487 4873 status_manager.go:851] "Failed to get status for pod" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:49:00 crc kubenswrapper[4873]: I0219 09:49:00.574993 4873 status_manager.go:851] "Failed to get status for pod" podUID="f9a9b521-3ed0-40c1-b38f-34c21bd9c242" pod="openshift-marketplace/redhat-marketplace-xvshp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xvshp\": dial tcp 38.102.83.156:6443: connect: connection refused"
Feb 19 09:49:01 crc kubenswrapper[4873]: I0219 09:49:01.589649 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c6339bec1dca8b314c006b7a4864177da334591576eecd00b09d4f3521a49b38"}
Feb 19 09:49:01 crc kubenswrapper[4873]: I0219 09:49:01.589988 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3888a3b0e14c6dec78fe115c9eadc62ab000440126248d1d59d55eced4929f06"}
Feb 19 09:49:01 crc kubenswrapper[4873]: I0219 09:49:01.590001 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5b3dd4ce991aad56ebcb19abc048af169ba47d9e4a3dbc58596263e422f7727d"}
Feb 19 09:49:01 crc kubenswrapper[4873]: I0219 09:49:01.590011 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f256ef8b0b3b7af748ec303bc2ecdf5f84bdedd23b3bed4d545fb0d12fab15ff"}
Feb 19 09:49:02 crc kubenswrapper[4873]: I0219 09:49:02.599978 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4b6b3848cd78789578434ba6cfb13eae1a718d6e12f52db62ef820a5ea154e2a"}
Feb 19 09:49:02 crc kubenswrapper[4873]: I0219 09:49:02.600290 4873 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d6df925a-1654-4ade-a300-97c316b0867f"
Feb 19 09:49:02 crc kubenswrapper[4873]: I0219 09:49:02.600307 4873 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d6df925a-1654-4ade-a300-97c316b0867f"
Feb 19 09:49:02 crc kubenswrapper[4873]: I0219 09:49:02.600501 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:49:04 crc kubenswrapper[4873]: I0219 09:49:04.502649 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:49:04 crc kubenswrapper[4873]: I0219 09:49:04.502878 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:49:04 crc kubenswrapper[4873]: I0219 09:49:04.510228 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:49:07 crc kubenswrapper[4873]: I0219 09:49:07.614042 4873 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:49:07 crc kubenswrapper[4873]: I0219 09:49:07.658850 4873 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6a9bbc9c-4111-45d8-a138-a15a31edae0a"
Feb 19 09:49:07 crc kubenswrapper[4873]: I0219 09:49:07.998798 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 09:49:08 crc kubenswrapper[4873]: I0219 09:49:08.644303 4873 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d6df925a-1654-4ade-a300-97c316b0867f"
Feb 19 09:49:08 crc kubenswrapper[4873]: I0219 09:49:08.644901 4873 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d6df925a-1654-4ade-a300-97c316b0867f"
Feb 19 09:49:08 crc kubenswrapper[4873]: I0219 09:49:08.647188 4873 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6a9bbc9c-4111-45d8-a138-a15a31edae0a"
Feb 19 09:49:08 crc kubenswrapper[4873]: I0219 09:49:08.655327 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 09:49:08 crc kubenswrapper[4873]: I0219 09:49:08.656173 4873 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 19 09:49:08 crc kubenswrapper[4873]: I0219 09:49:08.656218 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 19 09:49:08 crc kubenswrapper[4873]: I0219 09:49:08.669888 4873 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://f256ef8b0b3b7af748ec303bc2ecdf5f84bdedd23b3bed4d545fb0d12fab15ff"
Feb 19 09:49:08 crc kubenswrapper[4873]: I0219 09:49:08.669942 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:49:09 crc kubenswrapper[4873]: I0219 09:49:09.650283 4873 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d6df925a-1654-4ade-a300-97c316b0867f"
Feb 19 09:49:09 crc kubenswrapper[4873]: I0219 09:49:09.650635 4873 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d6df925a-1654-4ade-a300-97c316b0867f"
Feb 19 09:49:09 crc kubenswrapper[4873]: I0219 09:49:09.655228 4873 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6a9bbc9c-4111-45d8-a138-a15a31edae0a"
Feb 19 09:49:17 crc kubenswrapper[4873]: I0219 09:49:17.415809 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 19 09:49:18 crc kubenswrapper[4873]: I0219 09:49:18.107083 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 19 09:49:18 crc kubenswrapper[4873]: I0219 09:49:18.270829 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 19 09:49:18 crc kubenswrapper[4873]: I0219 09:49:18.413203 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 19 09:49:18 crc kubenswrapper[4873]: I0219 09:49:18.448129 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 19 09:49:18 crc kubenswrapper[4873]: I0219 09:49:18.520232 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 19 09:49:18 crc kubenswrapper[4873]: I0219 09:49:18.592126 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 19 09:49:18 crc kubenswrapper[4873]: I0219 09:49:18.619086 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 19 09:49:18 crc kubenswrapper[4873]: I0219 09:49:18.656008 4873 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 19 09:49:18 crc kubenswrapper[4873]: I0219 09:49:18.656087 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 19 09:49:18 crc kubenswrapper[4873]: I0219 09:49:18.692183 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 19 09:49:18 crc kubenswrapper[4873]: I0219 09:49:18.831174 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 19 09:49:18 crc kubenswrapper[4873]: I0219 09:49:18.894703 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 19 09:49:19 crc kubenswrapper[4873]: I0219 09:49:19.146755 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 19 09:49:19 crc kubenswrapper[4873]: I0219 09:49:19.321145 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 19 09:49:19 crc kubenswrapper[4873]: I0219 09:49:19.405568 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 19 09:49:19 crc kubenswrapper[4873]: I0219 09:49:19.523386 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 19 09:49:19 crc kubenswrapper[4873]: I0219 09:49:19.631074 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 19 09:49:19 crc kubenswrapper[4873]: I0219 09:49:19.675243 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 19 09:49:19 crc kubenswrapper[4873]: I0219 09:49:19.721733 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 19 09:49:19 crc kubenswrapper[4873]: I0219 09:49:19.822939 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 19 09:49:19 crc kubenswrapper[4873]: I0219 09:49:19.896002 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 19 09:49:19 crc kubenswrapper[4873]: I0219 09:49:19.994751 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 19 09:49:20 crc kubenswrapper[4873]: I0219 09:49:20.151436 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 19 09:49:20 crc kubenswrapper[4873]: I0219 09:49:20.261402 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 19 09:49:20 crc kubenswrapper[4873]: I0219 09:49:20.284700 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 19 09:49:20 crc kubenswrapper[4873]: I0219 09:49:20.403825 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 19 09:49:20 crc kubenswrapper[4873]: I0219 09:49:20.405436 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 19 09:49:20 crc kubenswrapper[4873]: I0219 09:49:20.516491 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 19 09:49:20 crc kubenswrapper[4873]: I0219 09:49:20.528971 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 19 09:49:20 crc kubenswrapper[4873]: I0219 09:49:20.842881 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.084966 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.183228 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.212625 4873 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.214924 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xvshp" podStartSLOduration=36.736770484 podStartE2EDuration="38.214905505s" podCreationTimestamp="2026-02-19 09:48:43 +0000 UTC" firstStartedPulling="2026-02-19 09:48:45.425537246 +0000 UTC m=+234.714968884" lastFinishedPulling="2026-02-19 09:48:46.903672267 +0000 UTC m=+236.193103905" observedRunningTime="2026-02-19 09:49:07.721762364 +0000 UTC m=+257.011194002" watchObservedRunningTime="2026-02-19 09:49:21.214905505 +0000 UTC m=+270.504337163"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.216226 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c2d4s" podStartSLOduration=35.72569925 podStartE2EDuration="38.216215268s" podCreationTimestamp="2026-02-19 09:48:43 +0000 UTC" firstStartedPulling="2026-02-19 09:48:45.422859198 +0000 UTC m=+234.712290836" lastFinishedPulling="2026-02-19 09:48:47.913375216 +0000 UTC m=+237.202806854" observedRunningTime="2026-02-19 09:49:07.60244869 +0000 UTC m=+256.891880328" watchObservedRunningTime="2026-02-19 09:49:21.216215268 +0000 UTC m=+270.505646936"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.217938 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.217980 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.226885 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.228058 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.228660 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.246834 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.246818465 podStartE2EDuration="14.246818465s" podCreationTimestamp="2026-02-19 09:49:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:49:21.246095067 +0000 UTC m=+270.535526725" watchObservedRunningTime="2026-02-19 09:49:21.246818465 +0000 UTC m=+270.536250103"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.393056 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.463138 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.626925 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.638011 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.751467 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.755462 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.821616 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.894010 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.902062 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.903170 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 19 09:49:21 crc kubenswrapper[4873]: I0219 09:49:21.991617 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.001434 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.010434 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.021727 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.150957 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.205686 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.274735 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.318183 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.346045 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.371576 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.375533 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.395812 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.433173 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.469781 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.470286 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.479525 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.575967 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.667981 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.736086 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.810723 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.816678 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.832943 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.833090 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.872091 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 19 09:49:22 crc kubenswrapper[4873]: I0219 09:49:22.967830 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 19 09:49:23 crc kubenswrapper[4873]: I0219 09:49:23.014080 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 19 09:49:23 crc kubenswrapper[4873]: I0219 09:49:23.017576 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 19 09:49:23 crc kubenswrapper[4873]: I0219 09:49:23.062113 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 19 09:49:23 crc kubenswrapper[4873]: I0219 09:49:23.146224 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 19 09:49:23 crc kubenswrapper[4873]: I0219 09:49:23.290773 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 19 09:49:23 crc kubenswrapper[4873]: I0219 09:49:23.521172 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 19 09:49:23 crc kubenswrapper[4873]: I0219 09:49:23.572821 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 19 09:49:23 crc kubenswrapper[4873]: I0219 09:49:23.581083 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 19 09:49:23 crc kubenswrapper[4873]: I0219 09:49:23.641614 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 19 09:49:23 crc kubenswrapper[4873]: I0219 09:49:23.641652 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 19 09:49:23 crc kubenswrapper[4873]: I0219 09:49:23.724562 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 19 09:49:23 crc kubenswrapper[4873]: I0219 09:49:23.731493 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 19 09:49:23 crc kubenswrapper[4873]: I0219 09:49:23.830070 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 19 09:49:23 crc kubenswrapper[4873]: I0219 09:49:23.832008 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.036218 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.123970 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.210965 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.324131 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.330584 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.341532 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.352084 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.525695 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.540597 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.542540 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.555776 4873 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.572424 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.692442 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.694956 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.711935 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.718911 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.740054 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.740512 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.781487 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.827254 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.932628 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.952628 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 19 09:49:24 crc kubenswrapper[4873]: I0219 09:49:24.994796 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.006557 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.009013 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.124619 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.159312 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.235264 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.333278 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.354836 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.357605 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.397899 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.508470 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.527215 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.767695 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.892661 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.916407 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 19 09:49:25 crc kubenswrapper[4873]: I0219 09:49:25.943327 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.053007 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.230442 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.413967 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.435527 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.456669 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.485655 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.486356 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.525627 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.601036 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.697794 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.699373 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.713666 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.806824 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.820858 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 19 09:49:26 crc
kubenswrapper[4873]: I0219 09:49:26.883401 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.909205 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 09:49:26 crc kubenswrapper[4873]: I0219 09:49:26.932592 4873 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.075071 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.083264 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.115396 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.223171 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.259637 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.272169 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.333354 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.339023 4873 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.389091 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.423253 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.528453 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.537965 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.563928 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.669907 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.719881 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.739228 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.761569 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.787113 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.798345 4873 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.818253 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 09:49:27 crc kubenswrapper[4873]: I0219 09:49:27.906664 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.151169 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.227341 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.273025 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.287347 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.471898 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.486996 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.489603 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.492181 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 19 09:49:28 crc 
kubenswrapper[4873]: I0219 09:49:28.519675 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.533039 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.649337 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.678527 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.706856 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.741223 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.786027 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 19 09:49:28 crc kubenswrapper[4873]: I0219 09:49:28.947325 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.004245 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.027976 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.031382 4873 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.031843 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.166722 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.201341 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.267385 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.297326 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.395388 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.443414 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.487801 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.495916 4873 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.507142 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.522796 4873 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.594628 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.666349 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.740784 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.902991 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 19 09:49:29 crc kubenswrapper[4873]: I0219 09:49:29.998827 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.010828 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.065683 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.071372 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.142831 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.230496 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 09:49:30 crc 
kubenswrapper[4873]: I0219 09:49:30.304665 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.346835 4873 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.347051 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://1cd15f918dfe991d39aa6c783668167ec9ec210784e46772d7b52f35e68404c5" gracePeriod=5 Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.396511 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.415415 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.474725 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.518585 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.542446 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.590243 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.705786 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 09:49:30 crc kubenswrapper[4873]: 
I0219 09:49:30.845190 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.904495 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.918491 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.925217 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 19 09:49:30 crc kubenswrapper[4873]: I0219 09:49:30.991630 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.008496 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.011651 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.026156 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.096184 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.098037 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.119620 4873 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.244061 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.278292 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.361009 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.430217 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.483059 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.534637 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.692504 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.716857 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 19 09:49:31 crc kubenswrapper[4873]: I0219 09:49:31.976621 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 09:49:32.027126 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 
09:49:32.035736 4873 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 09:49:32.201595 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 09:49:32.540682 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 09:49:32.551088 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 09:49:32.574893 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 09:49:32.632600 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 09:49:32.646355 4873 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 09:49:32.674772 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 09:49:32.723539 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 09:49:32.767289 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 09:49:32.815568 4873 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 09:49:32.860774 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 09:49:32.875817 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 19 09:49:32 crc kubenswrapper[4873]: I0219 09:49:32.965285 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 19 09:49:33 crc kubenswrapper[4873]: I0219 09:49:33.128509 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 19 09:49:33 crc kubenswrapper[4873]: I0219 09:49:33.133360 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 09:49:33 crc kubenswrapper[4873]: I0219 09:49:33.146647 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 19 09:49:33 crc kubenswrapper[4873]: I0219 09:49:33.150514 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 19 09:49:33 crc kubenswrapper[4873]: I0219 09:49:33.304898 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 09:49:33 crc kubenswrapper[4873]: I0219 09:49:33.337577 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 09:49:33 crc kubenswrapper[4873]: I0219 09:49:33.346667 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 19 09:49:33 crc kubenswrapper[4873]: 
I0219 09:49:33.440678 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 09:49:33 crc kubenswrapper[4873]: I0219 09:49:33.505929 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 19 09:49:33 crc kubenswrapper[4873]: I0219 09:49:33.968156 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 09:49:35 crc kubenswrapper[4873]: I0219 09:49:35.809499 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 09:49:35 crc kubenswrapper[4873]: I0219 09:49:35.809805 4873 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="1cd15f918dfe991d39aa6c783668167ec9ec210784e46772d7b52f35e68404c5" exitCode=137 Feb 19 09:49:35 crc kubenswrapper[4873]: I0219 09:49:35.917414 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 09:49:35 crc kubenswrapper[4873]: I0219 09:49:35.917615 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.061808 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.061949 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.061947 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.062010 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.062041 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.062056 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.062158 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.062165 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.062166 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.062455 4873 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.062482 4873 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.062501 4873 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.062521 4873 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.070621 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.163443 4873 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.817808 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.817879 4873 scope.go:117] "RemoveContainer" containerID="1cd15f918dfe991d39aa6c783668167ec9ec210784e46772d7b52f35e68404c5" Feb 19 09:49:36 crc kubenswrapper[4873]: I0219 09:49:36.818070 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 09:49:37 crc kubenswrapper[4873]: I0219 09:49:37.495559 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 19 09:49:51 crc kubenswrapper[4873]: I0219 09:49:51.183372 4873 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 19 09:50:02 crc kubenswrapper[4873]: I0219 09:50:02.738071 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qvxgz"] Feb 19 09:50:02 crc kubenswrapper[4873]: I0219 09:50:02.738880 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" podUID="c8c1d3a6-23fd-4526-8892-0add23b09a9a" containerName="controller-manager" containerID="cri-o://ca0ba083f2d897c6b2f519cbc9b73b7e76a6575165553e074c67d17692757e96" gracePeriod=30 Feb 19 09:50:02 crc 
kubenswrapper[4873]: I0219 09:50:02.741504 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp"] Feb 19 09:50:02 crc kubenswrapper[4873]: I0219 09:50:02.741708 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" podUID="bd468f98-7720-4f9a-972f-684b96f4f90f" containerName="route-controller-manager" containerID="cri-o://444f0cf49cee0a425b45e35836ef80b67c64300d173966d869b2bf6f32c4f2d2" gracePeriod=30 Feb 19 09:50:02 crc kubenswrapper[4873]: I0219 09:50:02.752191 4873 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qltqp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 19 09:50:02 crc kubenswrapper[4873]: I0219 09:50:02.752246 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" podUID="bd468f98-7720-4f9a-972f-684b96f4f90f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 19 09:50:02 crc kubenswrapper[4873]: I0219 09:50:02.974799 4873 generic.go:334] "Generic (PLEG): container finished" podID="c8c1d3a6-23fd-4526-8892-0add23b09a9a" containerID="ca0ba083f2d897c6b2f519cbc9b73b7e76a6575165553e074c67d17692757e96" exitCode=0 Feb 19 09:50:02 crc kubenswrapper[4873]: I0219 09:50:02.974864 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" event={"ID":"c8c1d3a6-23fd-4526-8892-0add23b09a9a","Type":"ContainerDied","Data":"ca0ba083f2d897c6b2f519cbc9b73b7e76a6575165553e074c67d17692757e96"} Feb 19 09:50:02 crc 
kubenswrapper[4873]: I0219 09:50:02.976710 4873 generic.go:334] "Generic (PLEG): container finished" podID="bd468f98-7720-4f9a-972f-684b96f4f90f" containerID="444f0cf49cee0a425b45e35836ef80b67c64300d173966d869b2bf6f32c4f2d2" exitCode=0 Feb 19 09:50:02 crc kubenswrapper[4873]: I0219 09:50:02.976755 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" event={"ID":"bd468f98-7720-4f9a-972f-684b96f4f90f","Type":"ContainerDied","Data":"444f0cf49cee0a425b45e35836ef80b67c64300d173966d869b2bf6f32c4f2d2"} Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.075221 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.080002 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.176079 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c1d3a6-23fd-4526-8892-0add23b09a9a-serving-cert\") pod \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.176152 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmb7k\" (UniqueName: \"kubernetes.io/projected/c8c1d3a6-23fd-4526-8892-0add23b09a9a-kube-api-access-mmb7k\") pod \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.176201 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-proxy-ca-bundles\") pod \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.176229 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd468f98-7720-4f9a-972f-684b96f4f90f-config\") pod \"bd468f98-7720-4f9a-972f-684b96f4f90f\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.176285 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-client-ca\") pod \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.176315 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd468f98-7720-4f9a-972f-684b96f4f90f-serving-cert\") pod \"bd468f98-7720-4f9a-972f-684b96f4f90f\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.176337 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-config\") pod \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\" (UID: \"c8c1d3a6-23fd-4526-8892-0add23b09a9a\") " Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.176929 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c8c1d3a6-23fd-4526-8892-0add23b09a9a" (UID: "c8c1d3a6-23fd-4526-8892-0add23b09a9a"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.176945 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd468f98-7720-4f9a-972f-684b96f4f90f-config" (OuterVolumeSpecName: "config") pod "bd468f98-7720-4f9a-972f-684b96f4f90f" (UID: "bd468f98-7720-4f9a-972f-684b96f4f90f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.176987 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-config" (OuterVolumeSpecName: "config") pod "c8c1d3a6-23fd-4526-8892-0add23b09a9a" (UID: "c8c1d3a6-23fd-4526-8892-0add23b09a9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.177052 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll4wz\" (UniqueName: \"kubernetes.io/projected/bd468f98-7720-4f9a-972f-684b96f4f90f-kube-api-access-ll4wz\") pod \"bd468f98-7720-4f9a-972f-684b96f4f90f\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.177071 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd468f98-7720-4f9a-972f-684b96f4f90f-client-ca\") pod \"bd468f98-7720-4f9a-972f-684b96f4f90f\" (UID: \"bd468f98-7720-4f9a-972f-684b96f4f90f\") " Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.177507 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-client-ca" (OuterVolumeSpecName: "client-ca") pod "c8c1d3a6-23fd-4526-8892-0add23b09a9a" (UID: "c8c1d3a6-23fd-4526-8892-0add23b09a9a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.177720 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd468f98-7720-4f9a-972f-684b96f4f90f-client-ca" (OuterVolumeSpecName: "client-ca") pod "bd468f98-7720-4f9a-972f-684b96f4f90f" (UID: "bd468f98-7720-4f9a-972f-684b96f4f90f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.177865 4873 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.177883 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd468f98-7720-4f9a-972f-684b96f4f90f-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.177903 4873 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.177916 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c1d3a6-23fd-4526-8892-0add23b09a9a-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.177929 4873 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd468f98-7720-4f9a-972f-684b96f4f90f-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.182753 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8c1d3a6-23fd-4526-8892-0add23b09a9a-serving-cert" 
(OuterVolumeSpecName: "serving-cert") pod "c8c1d3a6-23fd-4526-8892-0add23b09a9a" (UID: "c8c1d3a6-23fd-4526-8892-0add23b09a9a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.190570 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd468f98-7720-4f9a-972f-684b96f4f90f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bd468f98-7720-4f9a-972f-684b96f4f90f" (UID: "bd468f98-7720-4f9a-972f-684b96f4f90f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.191375 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8c1d3a6-23fd-4526-8892-0add23b09a9a-kube-api-access-mmb7k" (OuterVolumeSpecName: "kube-api-access-mmb7k") pod "c8c1d3a6-23fd-4526-8892-0add23b09a9a" (UID: "c8c1d3a6-23fd-4526-8892-0add23b09a9a"). InnerVolumeSpecName "kube-api-access-mmb7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.192064 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd468f98-7720-4f9a-972f-684b96f4f90f-kube-api-access-ll4wz" (OuterVolumeSpecName: "kube-api-access-ll4wz") pod "bd468f98-7720-4f9a-972f-684b96f4f90f" (UID: "bd468f98-7720-4f9a-972f-684b96f4f90f"). InnerVolumeSpecName "kube-api-access-ll4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.279506 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd468f98-7720-4f9a-972f-684b96f4f90f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.279541 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll4wz\" (UniqueName: \"kubernetes.io/projected/bd468f98-7720-4f9a-972f-684b96f4f90f-kube-api-access-ll4wz\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.279556 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c1d3a6-23fd-4526-8892-0add23b09a9a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.279568 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmb7k\" (UniqueName: \"kubernetes.io/projected/c8c1d3a6-23fd-4526-8892-0add23b09a9a-kube-api-access-mmb7k\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.982457 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.983053 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qvxgz" event={"ID":"c8c1d3a6-23fd-4526-8892-0add23b09a9a","Type":"ContainerDied","Data":"3919526da5da79321b05444b65501cd491975ca30007c3620a85b734545d5c95"} Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.983090 4873 scope.go:117] "RemoveContainer" containerID="ca0ba083f2d897c6b2f519cbc9b73b7e76a6575165553e074c67d17692757e96" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.986086 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" event={"ID":"bd468f98-7720-4f9a-972f-684b96f4f90f","Type":"ContainerDied","Data":"00d137182546ceb731d1231ff4489ff44e56001f5469f15e0d3bd78dd28af61d"} Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.986192 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp" Feb 19 09:50:03 crc kubenswrapper[4873]: I0219 09:50:03.998911 4873 scope.go:117] "RemoveContainer" containerID="444f0cf49cee0a425b45e35836ef80b67c64300d173966d869b2bf6f32c4f2d2" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.003703 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qvxgz"] Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.008567 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qvxgz"] Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.018700 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp"] Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.022030 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qltqp"] Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.053996 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c"] Feb 19 09:50:04 crc kubenswrapper[4873]: E0219 09:50:04.054293 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" containerName="installer" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.054308 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" containerName="installer" Feb 19 09:50:04 crc kubenswrapper[4873]: E0219 09:50:04.054318 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.054323 4873 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 09:50:04 crc kubenswrapper[4873]: E0219 09:50:04.054331 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c1d3a6-23fd-4526-8892-0add23b09a9a" containerName="controller-manager" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.054338 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c1d3a6-23fd-4526-8892-0add23b09a9a" containerName="controller-manager" Feb 19 09:50:04 crc kubenswrapper[4873]: E0219 09:50:04.054353 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd468f98-7720-4f9a-972f-684b96f4f90f" containerName="route-controller-manager" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.054358 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd468f98-7720-4f9a-972f-684b96f4f90f" containerName="route-controller-manager" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.054454 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8c1d3a6-23fd-4526-8892-0add23b09a9a" containerName="controller-manager" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.054464 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd468f98-7720-4f9a-972f-684b96f4f90f" containerName="route-controller-manager" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.054475 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd45a6e-fa80-4995-bab8-20796784d618" containerName="installer" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.054485 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.054853 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.055882 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-846b877c48-6mqsf"] Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.056504 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.058575 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.058745 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.058927 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.058956 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.062969 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.063505 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.065042 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c"] Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.065880 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 
09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.065987 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.065928 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.066050 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.066785 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.071379 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.071880 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.073057 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-846b877c48-6mqsf"] Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.087698 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c217268d-be2f-4eca-b26d-f6659ef4c9ce-serving-cert\") pod \"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.087729 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-serving-cert\") pod \"route-controller-manager-595695d48d-qrq4c\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.087750 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhlsg\" (UniqueName: \"kubernetes.io/projected/c217268d-be2f-4eca-b26d-f6659ef4c9ce-kube-api-access-lhlsg\") pod \"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.087770 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-config\") pod \"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.087790 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-client-ca\") pod \"route-controller-manager-595695d48d-qrq4c\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.087810 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwdgh\" (UniqueName: \"kubernetes.io/projected/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-kube-api-access-vwdgh\") pod \"route-controller-manager-595695d48d-qrq4c\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " 
pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.087828 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-client-ca\") pod \"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.087849 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-config\") pod \"route-controller-manager-595695d48d-qrq4c\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.087885 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-proxy-ca-bundles\") pod \"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.125608 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-846b877c48-6mqsf"] Feb 19 09:50:04 crc kubenswrapper[4873]: E0219 09:50:04.125939 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-lhlsg proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" podUID="c217268d-be2f-4eca-b26d-f6659ef4c9ce" Feb 
19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.133820 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c"] Feb 19 09:50:04 crc kubenswrapper[4873]: E0219 09:50:04.134201 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-vwdgh serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" podUID="cd6c625f-090c-449a-97c8-d67aa7a5ea3b" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.188639 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-serving-cert\") pod \"route-controller-manager-595695d48d-qrq4c\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.188675 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhlsg\" (UniqueName: \"kubernetes.io/projected/c217268d-be2f-4eca-b26d-f6659ef4c9ce-kube-api-access-lhlsg\") pod \"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.188706 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-config\") pod \"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.188736 4873 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-client-ca\") pod \"route-controller-manager-595695d48d-qrq4c\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.188762 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwdgh\" (UniqueName: \"kubernetes.io/projected/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-kube-api-access-vwdgh\") pod \"route-controller-manager-595695d48d-qrq4c\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.188792 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-client-ca\") pod \"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.188821 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-config\") pod \"route-controller-manager-595695d48d-qrq4c\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.188868 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-proxy-ca-bundles\") pod \"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " 
pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.188891 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c217268d-be2f-4eca-b26d-f6659ef4c9ce-serving-cert\") pod \"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.190203 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-proxy-ca-bundles\") pod \"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.190221 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-client-ca\") pod \"route-controller-manager-595695d48d-qrq4c\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.190600 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-config\") pod \"route-controller-manager-595695d48d-qrq4c\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.190654 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-client-ca\") pod 
\"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.191081 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-config\") pod \"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.192672 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c217268d-be2f-4eca-b26d-f6659ef4c9ce-serving-cert\") pod \"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.198728 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-serving-cert\") pod \"route-controller-manager-595695d48d-qrq4c\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.206130 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwdgh\" (UniqueName: \"kubernetes.io/projected/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-kube-api-access-vwdgh\") pod \"route-controller-manager-595695d48d-qrq4c\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:04 crc kubenswrapper[4873]: I0219 09:50:04.206949 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhlsg\" 
(UniqueName: \"kubernetes.io/projected/c217268d-be2f-4eca-b26d-f6659ef4c9ce-kube-api-access-lhlsg\") pod \"controller-manager-846b877c48-6mqsf\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:04.999963 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.000036 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.010955 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.019013 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.199308 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-proxy-ca-bundles\") pod \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.199373 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-config\") pod \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.199417 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhlsg\" (UniqueName: \"kubernetes.io/projected/c217268d-be2f-4eca-b26d-f6659ef4c9ce-kube-api-access-lhlsg\") pod \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.199461 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-config\") pod \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.199480 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-client-ca\") pod \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.199494 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-vwdgh\" (UniqueName: \"kubernetes.io/projected/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-kube-api-access-vwdgh\") pod \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.199519 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-serving-cert\") pod \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.199536 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c217268d-be2f-4eca-b26d-f6659ef4c9ce-serving-cert\") pod \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\" (UID: \"c217268d-be2f-4eca-b26d-f6659ef4c9ce\") " Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.199565 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-client-ca\") pod \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\" (UID: \"cd6c625f-090c-449a-97c8-d67aa7a5ea3b\") " Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.199963 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-config" (OuterVolumeSpecName: "config") pod "cd6c625f-090c-449a-97c8-d67aa7a5ea3b" (UID: "cd6c625f-090c-449a-97c8-d67aa7a5ea3b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.200151 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-client-ca" (OuterVolumeSpecName: "client-ca") pod "cd6c625f-090c-449a-97c8-d67aa7a5ea3b" (UID: "cd6c625f-090c-449a-97c8-d67aa7a5ea3b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.200343 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c217268d-be2f-4eca-b26d-f6659ef4c9ce" (UID: "c217268d-be2f-4eca-b26d-f6659ef4c9ce"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.200455 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-client-ca" (OuterVolumeSpecName: "client-ca") pod "c217268d-be2f-4eca-b26d-f6659ef4c9ce" (UID: "c217268d-be2f-4eca-b26d-f6659ef4c9ce"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.200898 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-config" (OuterVolumeSpecName: "config") pod "c217268d-be2f-4eca-b26d-f6659ef4c9ce" (UID: "c217268d-be2f-4eca-b26d-f6659ef4c9ce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.203459 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-kube-api-access-vwdgh" (OuterVolumeSpecName: "kube-api-access-vwdgh") pod "cd6c625f-090c-449a-97c8-d67aa7a5ea3b" (UID: "cd6c625f-090c-449a-97c8-d67aa7a5ea3b"). InnerVolumeSpecName "kube-api-access-vwdgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.203960 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c217268d-be2f-4eca-b26d-f6659ef4c9ce-kube-api-access-lhlsg" (OuterVolumeSpecName: "kube-api-access-lhlsg") pod "c217268d-be2f-4eca-b26d-f6659ef4c9ce" (UID: "c217268d-be2f-4eca-b26d-f6659ef4c9ce"). InnerVolumeSpecName "kube-api-access-lhlsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.205278 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c217268d-be2f-4eca-b26d-f6659ef4c9ce-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c217268d-be2f-4eca-b26d-f6659ef4c9ce" (UID: "c217268d-be2f-4eca-b26d-f6659ef4c9ce"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.216229 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cd6c625f-090c-449a-97c8-d67aa7a5ea3b" (UID: "cd6c625f-090c-449a-97c8-d67aa7a5ea3b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.301326 4873 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.301921 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.302087 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhlsg\" (UniqueName: \"kubernetes.io/projected/c217268d-be2f-4eca-b26d-f6659ef4c9ce-kube-api-access-lhlsg\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.302368 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.302547 4873 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c217268d-be2f-4eca-b26d-f6659ef4c9ce-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.302712 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwdgh\" (UniqueName: \"kubernetes.io/projected/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-kube-api-access-vwdgh\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.303012 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.303257 4873 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c217268d-be2f-4eca-b26d-f6659ef4c9ce-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.303455 4873 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd6c625f-090c-449a-97c8-d67aa7a5ea3b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.495754 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd468f98-7720-4f9a-972f-684b96f4f90f" path="/var/lib/kubelet/pods/bd468f98-7720-4f9a-972f-684b96f4f90f/volumes" Feb 19 09:50:05 crc kubenswrapper[4873]: I0219 09:50:05.496765 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8c1d3a6-23fd-4526-8892-0add23b09a9a" path="/var/lib/kubelet/pods/c8c1d3a6-23fd-4526-8892-0add23b09a9a/volumes" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.009666 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.010218 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-846b877c48-6mqsf" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.063449 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-846b877c48-6mqsf"] Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.074244 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb"] Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.075698 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.078760 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.079035 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.082507 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-846b877c48-6mqsf"] Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.088551 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb"] Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.089434 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.096791 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.096829 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.096918 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.098750 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c"] Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.099229 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 09:50:06 crc 
kubenswrapper[4873]: I0219 09:50:06.102740 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595695d48d-qrq4c"] Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.117331 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f3c0e1-ae96-4845-bafc-25bf413d357b-serving-cert\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.117388 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltkgk\" (UniqueName: \"kubernetes.io/projected/b6f3c0e1-ae96-4845-bafc-25bf413d357b-kube-api-access-ltkgk\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.117412 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-proxy-ca-bundles\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.117523 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-client-ca\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 
09:50:06.117544 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-config\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.218286 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f3c0e1-ae96-4845-bafc-25bf413d357b-serving-cert\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.218365 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltkgk\" (UniqueName: \"kubernetes.io/projected/b6f3c0e1-ae96-4845-bafc-25bf413d357b-kube-api-access-ltkgk\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.218397 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-proxy-ca-bundles\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.218421 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-client-ca\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " 
pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.218444 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-config\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.219747 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-client-ca\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.220223 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-config\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.220937 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-proxy-ca-bundles\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.224122 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f3c0e1-ae96-4845-bafc-25bf413d357b-serving-cert\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: 
\"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.246708 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltkgk\" (UniqueName: \"kubernetes.io/projected/b6f3c0e1-ae96-4845-bafc-25bf413d357b-kube-api-access-ltkgk\") pod \"controller-manager-7f6bd8fd79-7bjjb\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.409499 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:06 crc kubenswrapper[4873]: I0219 09:50:06.615591 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb"] Feb 19 09:50:06 crc kubenswrapper[4873]: W0219 09:50:06.629328 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6f3c0e1_ae96_4845_bafc_25bf413d357b.slice/crio-6f762e3ceb0d2e022ff5de2625a5e24c06e6239d60aae50fd9e175814ed573e9 WatchSource:0}: Error finding container 6f762e3ceb0d2e022ff5de2625a5e24c06e6239d60aae50fd9e175814ed573e9: Status 404 returned error can't find the container with id 6f762e3ceb0d2e022ff5de2625a5e24c06e6239d60aae50fd9e175814ed573e9 Feb 19 09:50:07 crc kubenswrapper[4873]: I0219 09:50:07.015801 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" event={"ID":"b6f3c0e1-ae96-4845-bafc-25bf413d357b","Type":"ContainerStarted","Data":"43f41ad0522c975ab5de8d4b7a10f731cd1ee469d6711e1f806948b3c65b26a6"} Feb 19 09:50:07 crc kubenswrapper[4873]: I0219 09:50:07.016210 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" event={"ID":"b6f3c0e1-ae96-4845-bafc-25bf413d357b","Type":"ContainerStarted","Data":"6f762e3ceb0d2e022ff5de2625a5e24c06e6239d60aae50fd9e175814ed573e9"} Feb 19 09:50:07 crc kubenswrapper[4873]: I0219 09:50:07.017600 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:07 crc kubenswrapper[4873]: I0219 09:50:07.034249 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:07 crc kubenswrapper[4873]: I0219 09:50:07.070921 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" podStartSLOduration=3.070901876 podStartE2EDuration="3.070901876s" podCreationTimestamp="2026-02-19 09:50:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:50:07.043394156 +0000 UTC m=+316.332825804" watchObservedRunningTime="2026-02-19 09:50:07.070901876 +0000 UTC m=+316.360333514" Feb 19 09:50:07 crc kubenswrapper[4873]: I0219 09:50:07.495981 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c217268d-be2f-4eca-b26d-f6659ef4c9ce" path="/var/lib/kubelet/pods/c217268d-be2f-4eca-b26d-f6659ef4c9ce/volumes" Feb 19 09:50:07 crc kubenswrapper[4873]: I0219 09:50:07.496627 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd6c625f-090c-449a-97c8-d67aa7a5ea3b" path="/var/lib/kubelet/pods/cd6c625f-090c-449a-97c8-d67aa7a5ea3b/volumes" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.058223 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x"] Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.059212 4873 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.062214 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.062334 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.062382 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.063070 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.063092 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.065955 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.074847 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x"] Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.243885 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db2587a-f66b-4e3e-855f-9973e9b28743-client-ca\") pod \"route-controller-manager-6685f4fd5b-cmw9x\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 
09:50:08.243992 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db2587a-f66b-4e3e-855f-9973e9b28743-config\") pod \"route-controller-manager-6685f4fd5b-cmw9x\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.244032 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kzg9\" (UniqueName: \"kubernetes.io/projected/3db2587a-f66b-4e3e-855f-9973e9b28743-kube-api-access-2kzg9\") pod \"route-controller-manager-6685f4fd5b-cmw9x\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.244095 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db2587a-f66b-4e3e-855f-9973e9b28743-serving-cert\") pod \"route-controller-manager-6685f4fd5b-cmw9x\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.345799 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db2587a-f66b-4e3e-855f-9973e9b28743-config\") pod \"route-controller-manager-6685f4fd5b-cmw9x\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.345929 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kzg9\" (UniqueName: 
\"kubernetes.io/projected/3db2587a-f66b-4e3e-855f-9973e9b28743-kube-api-access-2kzg9\") pod \"route-controller-manager-6685f4fd5b-cmw9x\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.346014 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db2587a-f66b-4e3e-855f-9973e9b28743-serving-cert\") pod \"route-controller-manager-6685f4fd5b-cmw9x\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.346092 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db2587a-f66b-4e3e-855f-9973e9b28743-client-ca\") pod \"route-controller-manager-6685f4fd5b-cmw9x\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.347320 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db2587a-f66b-4e3e-855f-9973e9b28743-client-ca\") pod \"route-controller-manager-6685f4fd5b-cmw9x\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.348356 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db2587a-f66b-4e3e-855f-9973e9b28743-config\") pod \"route-controller-manager-6685f4fd5b-cmw9x\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:08 crc 
kubenswrapper[4873]: I0219 09:50:08.354976 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db2587a-f66b-4e3e-855f-9973e9b28743-serving-cert\") pod \"route-controller-manager-6685f4fd5b-cmw9x\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.370328 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kzg9\" (UniqueName: \"kubernetes.io/projected/3db2587a-f66b-4e3e-855f-9973e9b28743-kube-api-access-2kzg9\") pod \"route-controller-manager-6685f4fd5b-cmw9x\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.389134 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:08 crc kubenswrapper[4873]: I0219 09:50:08.593693 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x"] Feb 19 09:50:09 crc kubenswrapper[4873]: I0219 09:50:09.030150 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" event={"ID":"3db2587a-f66b-4e3e-855f-9973e9b28743","Type":"ContainerStarted","Data":"bc6dcfd23752a86e6e1bb6f7dd9d0bc1aa50316ea10ce09d6252d4ba5b4e6b9f"} Feb 19 09:50:09 crc kubenswrapper[4873]: I0219 09:50:09.030206 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" event={"ID":"3db2587a-f66b-4e3e-855f-9973e9b28743","Type":"ContainerStarted","Data":"456ae351a251d151fea49e6f19e6eb9dec882c42d7b5599fb86ab622c2053df9"} Feb 19 09:50:09 
crc kubenswrapper[4873]: I0219 09:50:09.030367 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:09 crc kubenswrapper[4873]: I0219 09:50:09.053634 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" podStartSLOduration=5.053609115 podStartE2EDuration="5.053609115s" podCreationTimestamp="2026-02-19 09:50:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:50:09.051084672 +0000 UTC m=+318.340516340" watchObservedRunningTime="2026-02-19 09:50:09.053609115 +0000 UTC m=+318.343040793" Feb 19 09:50:09 crc kubenswrapper[4873]: I0219 09:50:09.098010 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:50:22 crc kubenswrapper[4873]: I0219 09:50:22.741588 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb"] Feb 19 09:50:22 crc kubenswrapper[4873]: I0219 09:50:22.742480 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" podUID="b6f3c0e1-ae96-4845-bafc-25bf413d357b" containerName="controller-manager" containerID="cri-o://43f41ad0522c975ab5de8d4b7a10f731cd1ee469d6711e1f806948b3c65b26a6" gracePeriod=30 Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.126706 4873 generic.go:334] "Generic (PLEG): container finished" podID="b6f3c0e1-ae96-4845-bafc-25bf413d357b" containerID="43f41ad0522c975ab5de8d4b7a10f731cd1ee469d6711e1f806948b3c65b26a6" exitCode=0 Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.126829 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" event={"ID":"b6f3c0e1-ae96-4845-bafc-25bf413d357b","Type":"ContainerDied","Data":"43f41ad0522c975ab5de8d4b7a10f731cd1ee469d6711e1f806948b3c65b26a6"} Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.261678 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.456407 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f3c0e1-ae96-4845-bafc-25bf413d357b-serving-cert\") pod \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.456660 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-client-ca\") pod \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.456692 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-config\") pod \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.456716 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-proxy-ca-bundles\") pod \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.456761 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltkgk\" 
(UniqueName: \"kubernetes.io/projected/b6f3c0e1-ae96-4845-bafc-25bf413d357b-kube-api-access-ltkgk\") pod \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\" (UID: \"b6f3c0e1-ae96-4845-bafc-25bf413d357b\") " Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.457533 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-client-ca" (OuterVolumeSpecName: "client-ca") pod "b6f3c0e1-ae96-4845-bafc-25bf413d357b" (UID: "b6f3c0e1-ae96-4845-bafc-25bf413d357b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.457546 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b6f3c0e1-ae96-4845-bafc-25bf413d357b" (UID: "b6f3c0e1-ae96-4845-bafc-25bf413d357b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.457589 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-config" (OuterVolumeSpecName: "config") pod "b6f3c0e1-ae96-4845-bafc-25bf413d357b" (UID: "b6f3c0e1-ae96-4845-bafc-25bf413d357b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.462451 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f3c0e1-ae96-4845-bafc-25bf413d357b-kube-api-access-ltkgk" (OuterVolumeSpecName: "kube-api-access-ltkgk") pod "b6f3c0e1-ae96-4845-bafc-25bf413d357b" (UID: "b6f3c0e1-ae96-4845-bafc-25bf413d357b"). InnerVolumeSpecName "kube-api-access-ltkgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.462638 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f3c0e1-ae96-4845-bafc-25bf413d357b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b6f3c0e1-ae96-4845-bafc-25bf413d357b" (UID: "b6f3c0e1-ae96-4845-bafc-25bf413d357b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.558632 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.558687 4873 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.558707 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltkgk\" (UniqueName: \"kubernetes.io/projected/b6f3c0e1-ae96-4845-bafc-25bf413d357b-kube-api-access-ltkgk\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.558721 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f3c0e1-ae96-4845-bafc-25bf413d357b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:23 crc kubenswrapper[4873]: I0219 09:50:23.558734 4873 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6f3c0e1-ae96-4845-bafc-25bf413d357b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.070365 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-846b877c48-4bs89"] 
Feb 19 09:50:24 crc kubenswrapper[4873]: E0219 09:50:24.070694 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f3c0e1-ae96-4845-bafc-25bf413d357b" containerName="controller-manager" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.070715 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f3c0e1-ae96-4845-bafc-25bf413d357b" containerName="controller-manager" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.070869 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f3c0e1-ae96-4845-bafc-25bf413d357b" containerName="controller-manager" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.071467 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.085084 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-846b877c48-4bs89"] Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.134868 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" event={"ID":"b6f3c0e1-ae96-4845-bafc-25bf413d357b","Type":"ContainerDied","Data":"6f762e3ceb0d2e022ff5de2625a5e24c06e6239d60aae50fd9e175814ed573e9"} Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.134925 4873 scope.go:117] "RemoveContainer" containerID="43f41ad0522c975ab5de8d4b7a10f731cd1ee469d6711e1f806948b3c65b26a6" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.135039 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.159624 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb"] Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.167959 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f6bd8fd79-7bjjb"] Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.168732 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8af9160-21ef-4a41-8ffe-513930b969d0-serving-cert\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.168801 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46jnl\" (UniqueName: \"kubernetes.io/projected/d8af9160-21ef-4a41-8ffe-513930b969d0-kube-api-access-46jnl\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.168847 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8af9160-21ef-4a41-8ffe-513930b969d0-client-ca\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.168971 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d8af9160-21ef-4a41-8ffe-513930b969d0-proxy-ca-bundles\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.169013 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8af9160-21ef-4a41-8ffe-513930b969d0-config\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.270443 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8af9160-21ef-4a41-8ffe-513930b969d0-serving-cert\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.270866 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8af9160-21ef-4a41-8ffe-513930b969d0-client-ca\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.270903 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46jnl\" (UniqueName: \"kubernetes.io/projected/d8af9160-21ef-4a41-8ffe-513930b969d0-kube-api-access-46jnl\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.271064 
4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8af9160-21ef-4a41-8ffe-513930b969d0-proxy-ca-bundles\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.271130 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8af9160-21ef-4a41-8ffe-513930b969d0-config\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.272545 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8af9160-21ef-4a41-8ffe-513930b969d0-client-ca\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.272791 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8af9160-21ef-4a41-8ffe-513930b969d0-proxy-ca-bundles\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.274811 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8af9160-21ef-4a41-8ffe-513930b969d0-config\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 
09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.277379 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8af9160-21ef-4a41-8ffe-513930b969d0-serving-cert\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.295199 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46jnl\" (UniqueName: \"kubernetes.io/projected/d8af9160-21ef-4a41-8ffe-513930b969d0-kube-api-access-46jnl\") pod \"controller-manager-846b877c48-4bs89\" (UID: \"d8af9160-21ef-4a41-8ffe-513930b969d0\") " pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.398022 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:24 crc kubenswrapper[4873]: I0219 09:50:24.987697 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-846b877c48-4bs89"] Feb 19 09:50:25 crc kubenswrapper[4873]: I0219 09:50:25.141579 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" event={"ID":"d8af9160-21ef-4a41-8ffe-513930b969d0","Type":"ContainerStarted","Data":"b4ec1545226012a614754f2e26b31dc61ea2543f74b40eab1ebee163b1c0b92a"} Feb 19 09:50:25 crc kubenswrapper[4873]: I0219 09:50:25.141982 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:25 crc kubenswrapper[4873]: I0219 09:50:25.141996 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" 
event={"ID":"d8af9160-21ef-4a41-8ffe-513930b969d0","Type":"ContainerStarted","Data":"ab475893ee1c93f0632441d26257315871e2f293d6676cdd8efdf0b34cc52214"} Feb 19 09:50:25 crc kubenswrapper[4873]: I0219 09:50:25.149175 4873 patch_prober.go:28] interesting pod/controller-manager-846b877c48-4bs89 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body= Feb 19 09:50:25 crc kubenswrapper[4873]: I0219 09:50:25.149216 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" podUID="d8af9160-21ef-4a41-8ffe-513930b969d0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" Feb 19 09:50:25 crc kubenswrapper[4873]: I0219 09:50:25.160906 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" podStartSLOduration=3.160887639 podStartE2EDuration="3.160887639s" podCreationTimestamp="2026-02-19 09:50:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:50:25.158257073 +0000 UTC m=+334.447688711" watchObservedRunningTime="2026-02-19 09:50:25.160887639 +0000 UTC m=+334.450319277" Feb 19 09:50:25 crc kubenswrapper[4873]: I0219 09:50:25.496744 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f3c0e1-ae96-4845-bafc-25bf413d357b" path="/var/lib/kubelet/pods/b6f3c0e1-ae96-4845-bafc-25bf413d357b/volumes" Feb 19 09:50:26 crc kubenswrapper[4873]: I0219 09:50:26.154438 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-846b877c48-4bs89" Feb 19 09:50:40 crc 
kubenswrapper[4873]: I0219 09:50:40.292881 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zk9wc"] Feb 19 09:50:40 crc kubenswrapper[4873]: I0219 09:50:40.294338 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:40 crc kubenswrapper[4873]: I0219 09:50:40.303164 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 09:50:40 crc kubenswrapper[4873]: I0219 09:50:40.312264 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zk9wc"] Feb 19 09:50:40 crc kubenswrapper[4873]: I0219 09:50:40.415276 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f466b31-21ca-4f19-9b73-72cfb7c68d55-utilities\") pod \"community-operators-zk9wc\" (UID: \"5f466b31-21ca-4f19-9b73-72cfb7c68d55\") " pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:40 crc kubenswrapper[4873]: I0219 09:50:40.415356 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8kfj\" (UniqueName: \"kubernetes.io/projected/5f466b31-21ca-4f19-9b73-72cfb7c68d55-kube-api-access-j8kfj\") pod \"community-operators-zk9wc\" (UID: \"5f466b31-21ca-4f19-9b73-72cfb7c68d55\") " pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:40 crc kubenswrapper[4873]: I0219 09:50:40.415420 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f466b31-21ca-4f19-9b73-72cfb7c68d55-catalog-content\") pod \"community-operators-zk9wc\" (UID: \"5f466b31-21ca-4f19-9b73-72cfb7c68d55\") " pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:40 crc kubenswrapper[4873]: 
I0219 09:50:40.516606 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f466b31-21ca-4f19-9b73-72cfb7c68d55-utilities\") pod \"community-operators-zk9wc\" (UID: \"5f466b31-21ca-4f19-9b73-72cfb7c68d55\") " pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:40 crc kubenswrapper[4873]: I0219 09:50:40.516674 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8kfj\" (UniqueName: \"kubernetes.io/projected/5f466b31-21ca-4f19-9b73-72cfb7c68d55-kube-api-access-j8kfj\") pod \"community-operators-zk9wc\" (UID: \"5f466b31-21ca-4f19-9b73-72cfb7c68d55\") " pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:40 crc kubenswrapper[4873]: I0219 09:50:40.516702 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f466b31-21ca-4f19-9b73-72cfb7c68d55-catalog-content\") pod \"community-operators-zk9wc\" (UID: \"5f466b31-21ca-4f19-9b73-72cfb7c68d55\") " pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:40 crc kubenswrapper[4873]: I0219 09:50:40.517174 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f466b31-21ca-4f19-9b73-72cfb7c68d55-catalog-content\") pod \"community-operators-zk9wc\" (UID: \"5f466b31-21ca-4f19-9b73-72cfb7c68d55\") " pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:40 crc kubenswrapper[4873]: I0219 09:50:40.517313 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f466b31-21ca-4f19-9b73-72cfb7c68d55-utilities\") pod \"community-operators-zk9wc\" (UID: \"5f466b31-21ca-4f19-9b73-72cfb7c68d55\") " pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:40 crc kubenswrapper[4873]: I0219 09:50:40.538955 4873 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8kfj\" (UniqueName: \"kubernetes.io/projected/5f466b31-21ca-4f19-9b73-72cfb7c68d55-kube-api-access-j8kfj\") pod \"community-operators-zk9wc\" (UID: \"5f466b31-21ca-4f19-9b73-72cfb7c68d55\") " pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:40 crc kubenswrapper[4873]: I0219 09:50:40.607858 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:41 crc kubenswrapper[4873]: I0219 09:50:41.015847 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zk9wc"] Feb 19 09:50:41 crc kubenswrapper[4873]: W0219 09:50:41.021369 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f466b31_21ca_4f19_9b73_72cfb7c68d55.slice/crio-37110418ece1ef9b6ea5a86e0c098f88d36ad6c88b57c3d8af76a46e26ee0033 WatchSource:0}: Error finding container 37110418ece1ef9b6ea5a86e0c098f88d36ad6c88b57c3d8af76a46e26ee0033: Status 404 returned error can't find the container with id 37110418ece1ef9b6ea5a86e0c098f88d36ad6c88b57c3d8af76a46e26ee0033 Feb 19 09:50:41 crc kubenswrapper[4873]: I0219 09:50:41.231070 4873 generic.go:334] "Generic (PLEG): container finished" podID="5f466b31-21ca-4f19-9b73-72cfb7c68d55" containerID="235a4cc9de0ce0d9c318ba6b2fc1c727f9e9bc32ac5707fc258224436b9deb00" exitCode=0 Feb 19 09:50:41 crc kubenswrapper[4873]: I0219 09:50:41.231136 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zk9wc" event={"ID":"5f466b31-21ca-4f19-9b73-72cfb7c68d55","Type":"ContainerDied","Data":"235a4cc9de0ce0d9c318ba6b2fc1c727f9e9bc32ac5707fc258224436b9deb00"} Feb 19 09:50:41 crc kubenswrapper[4873]: I0219 09:50:41.231376 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zk9wc" 
event={"ID":"5f466b31-21ca-4f19-9b73-72cfb7c68d55","Type":"ContainerStarted","Data":"37110418ece1ef9b6ea5a86e0c098f88d36ad6c88b57c3d8af76a46e26ee0033"} Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.236951 4873 generic.go:334] "Generic (PLEG): container finished" podID="5f466b31-21ca-4f19-9b73-72cfb7c68d55" containerID="32511e960070046f73db7a65487a607b53bb5a81f10c6e1cb939ee3d03b1c42f" exitCode=0 Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.237168 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zk9wc" event={"ID":"5f466b31-21ca-4f19-9b73-72cfb7c68d55","Type":"ContainerDied","Data":"32511e960070046f73db7a65487a607b53bb5a81f10c6e1cb939ee3d03b1c42f"} Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.693971 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-prw4c"] Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.695488 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.696820 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.711606 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prw4c"] Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.746778 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cc54252-cfdf-4b71-bfa5-552dcd26500d-catalog-content\") pod \"redhat-operators-prw4c\" (UID: \"4cc54252-cfdf-4b71-bfa5-552dcd26500d\") " pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.746825 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bjq7\" (UniqueName: \"kubernetes.io/projected/4cc54252-cfdf-4b71-bfa5-552dcd26500d-kube-api-access-4bjq7\") pod \"redhat-operators-prw4c\" (UID: \"4cc54252-cfdf-4b71-bfa5-552dcd26500d\") " pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.747018 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cc54252-cfdf-4b71-bfa5-552dcd26500d-utilities\") pod \"redhat-operators-prw4c\" (UID: \"4cc54252-cfdf-4b71-bfa5-552dcd26500d\") " pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.847855 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cc54252-cfdf-4b71-bfa5-552dcd26500d-catalog-content\") pod \"redhat-operators-prw4c\" (UID: 
\"4cc54252-cfdf-4b71-bfa5-552dcd26500d\") " pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.847901 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bjq7\" (UniqueName: \"kubernetes.io/projected/4cc54252-cfdf-4b71-bfa5-552dcd26500d-kube-api-access-4bjq7\") pod \"redhat-operators-prw4c\" (UID: \"4cc54252-cfdf-4b71-bfa5-552dcd26500d\") " pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.847955 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cc54252-cfdf-4b71-bfa5-552dcd26500d-utilities\") pod \"redhat-operators-prw4c\" (UID: \"4cc54252-cfdf-4b71-bfa5-552dcd26500d\") " pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.848481 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cc54252-cfdf-4b71-bfa5-552dcd26500d-utilities\") pod \"redhat-operators-prw4c\" (UID: \"4cc54252-cfdf-4b71-bfa5-552dcd26500d\") " pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.848889 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cc54252-cfdf-4b71-bfa5-552dcd26500d-catalog-content\") pod \"redhat-operators-prw4c\" (UID: \"4cc54252-cfdf-4b71-bfa5-552dcd26500d\") " pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:50:42 crc kubenswrapper[4873]: I0219 09:50:42.866994 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bjq7\" (UniqueName: \"kubernetes.io/projected/4cc54252-cfdf-4b71-bfa5-552dcd26500d-kube-api-access-4bjq7\") pod \"redhat-operators-prw4c\" (UID: \"4cc54252-cfdf-4b71-bfa5-552dcd26500d\") " 
pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:50:43 crc kubenswrapper[4873]: I0219 09:50:43.049807 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:50:43 crc kubenswrapper[4873]: I0219 09:50:43.247319 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zk9wc" event={"ID":"5f466b31-21ca-4f19-9b73-72cfb7c68d55","Type":"ContainerStarted","Data":"c877ae60364e1ca60cc0eb5b572c2d10ada45db34b84d724f8be386b520c7fe2"} Feb 19 09:50:43 crc kubenswrapper[4873]: I0219 09:50:43.272333 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zk9wc" podStartSLOduration=1.8635431709999999 podStartE2EDuration="3.272308565s" podCreationTimestamp="2026-02-19 09:50:40 +0000 UTC" firstStartedPulling="2026-02-19 09:50:41.233298196 +0000 UTC m=+350.522729844" lastFinishedPulling="2026-02-19 09:50:42.64206357 +0000 UTC m=+351.931495238" observedRunningTime="2026-02-19 09:50:43.26932816 +0000 UTC m=+352.558759838" watchObservedRunningTime="2026-02-19 09:50:43.272308565 +0000 UTC m=+352.561740213" Feb 19 09:50:43 crc kubenswrapper[4873]: I0219 09:50:43.508799 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prw4c"] Feb 19 09:50:44 crc kubenswrapper[4873]: I0219 09:50:44.253383 4873 generic.go:334] "Generic (PLEG): container finished" podID="4cc54252-cfdf-4b71-bfa5-552dcd26500d" containerID="f3f57712b2360a2b482c6d51abd9045fd079835e664170ddd4c3c59343cbfa4f" exitCode=0 Feb 19 09:50:44 crc kubenswrapper[4873]: I0219 09:50:44.253496 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prw4c" event={"ID":"4cc54252-cfdf-4b71-bfa5-552dcd26500d","Type":"ContainerDied","Data":"f3f57712b2360a2b482c6d51abd9045fd079835e664170ddd4c3c59343cbfa4f"} Feb 19 09:50:44 crc kubenswrapper[4873]: I0219 
09:50:44.254028 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prw4c" event={"ID":"4cc54252-cfdf-4b71-bfa5-552dcd26500d","Type":"ContainerStarted","Data":"196214af4696ef64a53a09ceae50492efa721dd9677dfaa5fb81d9035acae728"} Feb 19 09:50:46 crc kubenswrapper[4873]: I0219 09:50:46.268311 4873 generic.go:334] "Generic (PLEG): container finished" podID="4cc54252-cfdf-4b71-bfa5-552dcd26500d" containerID="f03d133f183eef9c098e95f93377b7ee5f8e0c45d8583135279394f2b3e5426e" exitCode=0 Feb 19 09:50:46 crc kubenswrapper[4873]: I0219 09:50:46.268404 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prw4c" event={"ID":"4cc54252-cfdf-4b71-bfa5-552dcd26500d","Type":"ContainerDied","Data":"f03d133f183eef9c098e95f93377b7ee5f8e0c45d8583135279394f2b3e5426e"} Feb 19 09:50:47 crc kubenswrapper[4873]: I0219 09:50:47.278953 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prw4c" event={"ID":"4cc54252-cfdf-4b71-bfa5-552dcd26500d","Type":"ContainerStarted","Data":"482af24b132b8069e78985473f493d01541d4588becf63b36629d98342cbc8cb"} Feb 19 09:50:48 crc kubenswrapper[4873]: I0219 09:50:48.240694 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:50:48 crc kubenswrapper[4873]: I0219 09:50:48.240756 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:50:50 crc kubenswrapper[4873]: I0219 09:50:50.608335 4873 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:50 crc kubenswrapper[4873]: I0219 09:50:50.609882 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:50 crc kubenswrapper[4873]: I0219 09:50:50.658152 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:50 crc kubenswrapper[4873]: I0219 09:50:50.678133 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-prw4c" podStartSLOduration=6.29906777 podStartE2EDuration="8.678113724s" podCreationTimestamp="2026-02-19 09:50:42 +0000 UTC" firstStartedPulling="2026-02-19 09:50:44.255003167 +0000 UTC m=+353.544434805" lastFinishedPulling="2026-02-19 09:50:46.634049121 +0000 UTC m=+355.923480759" observedRunningTime="2026-02-19 09:50:47.300705083 +0000 UTC m=+356.590136771" watchObservedRunningTime="2026-02-19 09:50:50.678113724 +0000 UTC m=+359.967545372" Feb 19 09:50:51 crc kubenswrapper[4873]: I0219 09:50:51.348048 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zk9wc" Feb 19 09:50:53 crc kubenswrapper[4873]: I0219 09:50:53.050332 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:50:53 crc kubenswrapper[4873]: I0219 09:50:53.050383 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:50:54 crc kubenswrapper[4873]: I0219 09:50:54.124895 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-prw4c" podUID="4cc54252-cfdf-4b71-bfa5-552dcd26500d" containerName="registry-server" probeResult="failure" output=< Feb 19 09:50:54 crc 
kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s Feb 19 09:50:54 crc kubenswrapper[4873]: > Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.650384 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gkkzf"] Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.651505 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.672740 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gkkzf"] Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.747506 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4f04829-f740-4d17-9358-f59fa6561eaa-registry-tls\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.747555 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4f04829-f740-4d17-9358-f59fa6561eaa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.747592 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbb6c\" (UniqueName: \"kubernetes.io/projected/f4f04829-f740-4d17-9358-f59fa6561eaa-kube-api-access-sbb6c\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 
09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.747616 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.747639 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f04829-f740-4d17-9358-f59fa6561eaa-trusted-ca\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.747711 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4f04829-f740-4d17-9358-f59fa6561eaa-registry-certificates\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.747727 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4f04829-f740-4d17-9358-f59fa6561eaa-bound-sa-token\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.747758 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/f4f04829-f740-4d17-9358-f59fa6561eaa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.772817 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.849625 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbb6c\" (UniqueName: \"kubernetes.io/projected/f4f04829-f740-4d17-9358-f59fa6561eaa-kube-api-access-sbb6c\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.849744 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f04829-f740-4d17-9358-f59fa6561eaa-trusted-ca\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.850284 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4f04829-f740-4d17-9358-f59fa6561eaa-registry-certificates\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.850413 
4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4f04829-f740-4d17-9358-f59fa6561eaa-bound-sa-token\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.850544 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4f04829-f740-4d17-9358-f59fa6561eaa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.850626 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4f04829-f740-4d17-9358-f59fa6561eaa-registry-tls\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.850687 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4f04829-f740-4d17-9358-f59fa6561eaa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.851574 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4f04829-f740-4d17-9358-f59fa6561eaa-registry-certificates\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.851777 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4f04829-f740-4d17-9358-f59fa6561eaa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.852934 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f04829-f740-4d17-9358-f59fa6561eaa-trusted-ca\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.858217 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4f04829-f740-4d17-9358-f59fa6561eaa-registry-tls\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.861166 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4f04829-f740-4d17-9358-f59fa6561eaa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.871538 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4f04829-f740-4d17-9358-f59fa6561eaa-bound-sa-token\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: 
\"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.872348 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbb6c\" (UniqueName: \"kubernetes.io/projected/f4f04829-f740-4d17-9358-f59fa6561eaa-kube-api-access-sbb6c\") pod \"image-registry-66df7c8f76-gkkzf\" (UID: \"f4f04829-f740-4d17-9358-f59fa6561eaa\") " pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:56 crc kubenswrapper[4873]: I0219 09:50:56.970923 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:57 crc kubenswrapper[4873]: I0219 09:50:57.522707 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gkkzf"] Feb 19 09:50:57 crc kubenswrapper[4873]: W0219 09:50:57.526529 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f04829_f740_4d17_9358_f59fa6561eaa.slice/crio-1304e88140e14bf1f928899d3525033eebe6aee5fdb24b96819955c54c64d700 WatchSource:0}: Error finding container 1304e88140e14bf1f928899d3525033eebe6aee5fdb24b96819955c54c64d700: Status 404 returned error can't find the container with id 1304e88140e14bf1f928899d3525033eebe6aee5fdb24b96819955c54c64d700 Feb 19 09:50:58 crc kubenswrapper[4873]: I0219 09:50:58.339666 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" event={"ID":"f4f04829-f740-4d17-9358-f59fa6561eaa","Type":"ContainerStarted","Data":"2ac1c42242b389bf16129558a39272bbf47249f38b3e44908f6b6cf7ca19450e"} Feb 19 09:50:58 crc kubenswrapper[4873]: I0219 09:50:58.340018 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:50:58 crc 
kubenswrapper[4873]: I0219 09:50:58.340036 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" event={"ID":"f4f04829-f740-4d17-9358-f59fa6561eaa","Type":"ContainerStarted","Data":"1304e88140e14bf1f928899d3525033eebe6aee5fdb24b96819955c54c64d700"} Feb 19 09:50:58 crc kubenswrapper[4873]: I0219 09:50:58.360162 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" podStartSLOduration=2.360146501 podStartE2EDuration="2.360146501s" podCreationTimestamp="2026-02-19 09:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:50:58.359431713 +0000 UTC m=+367.648863371" watchObservedRunningTime="2026-02-19 09:50:58.360146501 +0000 UTC m=+367.649578139" Feb 19 09:51:03 crc kubenswrapper[4873]: I0219 09:51:03.119870 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:51:03 crc kubenswrapper[4873]: I0219 09:51:03.166373 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-prw4c" Feb 19 09:51:16 crc kubenswrapper[4873]: I0219 09:51:16.978307 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-gkkzf" Feb 19 09:51:17 crc kubenswrapper[4873]: I0219 09:51:17.048992 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7hhjq"] Feb 19 09:51:18 crc kubenswrapper[4873]: I0219 09:51:18.240758 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 19 09:51:18 crc kubenswrapper[4873]: I0219 09:51:18.241213 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:51:22 crc kubenswrapper[4873]: I0219 09:51:22.748365 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x"] Feb 19 09:51:22 crc kubenswrapper[4873]: I0219 09:51:22.748878 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" podUID="3db2587a-f66b-4e3e-855f-9973e9b28743" containerName="route-controller-manager" containerID="cri-o://bc6dcfd23752a86e6e1bb6f7dd9d0bc1aa50316ea10ce09d6252d4ba5b4e6b9f" gracePeriod=30 Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.491920 4873 generic.go:334] "Generic (PLEG): container finished" podID="3db2587a-f66b-4e3e-855f-9973e9b28743" containerID="bc6dcfd23752a86e6e1bb6f7dd9d0bc1aa50316ea10ce09d6252d4ba5b4e6b9f" exitCode=0 Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.492015 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" event={"ID":"3db2587a-f66b-4e3e-855f-9973e9b28743","Type":"ContainerDied","Data":"bc6dcfd23752a86e6e1bb6f7dd9d0bc1aa50316ea10ce09d6252d4ba5b4e6b9f"} Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.715938 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.814222 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kzg9\" (UniqueName: \"kubernetes.io/projected/3db2587a-f66b-4e3e-855f-9973e9b28743-kube-api-access-2kzg9\") pod \"3db2587a-f66b-4e3e-855f-9973e9b28743\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.814304 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db2587a-f66b-4e3e-855f-9973e9b28743-client-ca\") pod \"3db2587a-f66b-4e3e-855f-9973e9b28743\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.814340 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db2587a-f66b-4e3e-855f-9973e9b28743-serving-cert\") pod \"3db2587a-f66b-4e3e-855f-9973e9b28743\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.814372 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db2587a-f66b-4e3e-855f-9973e9b28743-config\") pod \"3db2587a-f66b-4e3e-855f-9973e9b28743\" (UID: \"3db2587a-f66b-4e3e-855f-9973e9b28743\") " Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.815765 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3db2587a-f66b-4e3e-855f-9973e9b28743-client-ca" (OuterVolumeSpecName: "client-ca") pod "3db2587a-f66b-4e3e-855f-9973e9b28743" (UID: "3db2587a-f66b-4e3e-855f-9973e9b28743"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.816324 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3db2587a-f66b-4e3e-855f-9973e9b28743-config" (OuterVolumeSpecName: "config") pod "3db2587a-f66b-4e3e-855f-9973e9b28743" (UID: "3db2587a-f66b-4e3e-855f-9973e9b28743"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.828037 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db2587a-f66b-4e3e-855f-9973e9b28743-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3db2587a-f66b-4e3e-855f-9973e9b28743" (UID: "3db2587a-f66b-4e3e-855f-9973e9b28743"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.828059 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db2587a-f66b-4e3e-855f-9973e9b28743-kube-api-access-2kzg9" (OuterVolumeSpecName: "kube-api-access-2kzg9") pod "3db2587a-f66b-4e3e-855f-9973e9b28743" (UID: "3db2587a-f66b-4e3e-855f-9973e9b28743"). InnerVolumeSpecName "kube-api-access-2kzg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.915709 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kzg9\" (UniqueName: \"kubernetes.io/projected/3db2587a-f66b-4e3e-855f-9973e9b28743-kube-api-access-2kzg9\") on node \"crc\" DevicePath \"\"" Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.915753 4873 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db2587a-f66b-4e3e-855f-9973e9b28743-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.915768 4873 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db2587a-f66b-4e3e-855f-9973e9b28743-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:51:23 crc kubenswrapper[4873]: I0219 09:51:23.915782 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db2587a-f66b-4e3e-855f-9973e9b28743-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.119540 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq"] Feb 19 09:51:24 crc kubenswrapper[4873]: E0219 09:51:24.119931 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db2587a-f66b-4e3e-855f-9973e9b28743" containerName="route-controller-manager" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.119964 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db2587a-f66b-4e3e-855f-9973e9b28743" containerName="route-controller-manager" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.120187 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db2587a-f66b-4e3e-855f-9973e9b28743" containerName="route-controller-manager" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.120844 
4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.148718 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq"] Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.220336 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fprpl\" (UniqueName: \"kubernetes.io/projected/26cf91f4-9f21-487d-8ee9-23700f39e900-kube-api-access-fprpl\") pod \"route-controller-manager-595695d48d-g88dq\" (UID: \"26cf91f4-9f21-487d-8ee9-23700f39e900\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.220440 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26cf91f4-9f21-487d-8ee9-23700f39e900-client-ca\") pod \"route-controller-manager-595695d48d-g88dq\" (UID: \"26cf91f4-9f21-487d-8ee9-23700f39e900\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.220464 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26cf91f4-9f21-487d-8ee9-23700f39e900-config\") pod \"route-controller-manager-595695d48d-g88dq\" (UID: \"26cf91f4-9f21-487d-8ee9-23700f39e900\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.220545 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26cf91f4-9f21-487d-8ee9-23700f39e900-serving-cert\") pod 
\"route-controller-manager-595695d48d-g88dq\" (UID: \"26cf91f4-9f21-487d-8ee9-23700f39e900\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.322287 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26cf91f4-9f21-487d-8ee9-23700f39e900-serving-cert\") pod \"route-controller-manager-595695d48d-g88dq\" (UID: \"26cf91f4-9f21-487d-8ee9-23700f39e900\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.322369 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fprpl\" (UniqueName: \"kubernetes.io/projected/26cf91f4-9f21-487d-8ee9-23700f39e900-kube-api-access-fprpl\") pod \"route-controller-manager-595695d48d-g88dq\" (UID: \"26cf91f4-9f21-487d-8ee9-23700f39e900\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.322417 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26cf91f4-9f21-487d-8ee9-23700f39e900-client-ca\") pod \"route-controller-manager-595695d48d-g88dq\" (UID: \"26cf91f4-9f21-487d-8ee9-23700f39e900\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.322442 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26cf91f4-9f21-487d-8ee9-23700f39e900-config\") pod \"route-controller-manager-595695d48d-g88dq\" (UID: \"26cf91f4-9f21-487d-8ee9-23700f39e900\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.324135 4873 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26cf91f4-9f21-487d-8ee9-23700f39e900-client-ca\") pod \"route-controller-manager-595695d48d-g88dq\" (UID: \"26cf91f4-9f21-487d-8ee9-23700f39e900\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.324186 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26cf91f4-9f21-487d-8ee9-23700f39e900-config\") pod \"route-controller-manager-595695d48d-g88dq\" (UID: \"26cf91f4-9f21-487d-8ee9-23700f39e900\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.329568 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26cf91f4-9f21-487d-8ee9-23700f39e900-serving-cert\") pod \"route-controller-manager-595695d48d-g88dq\" (UID: \"26cf91f4-9f21-487d-8ee9-23700f39e900\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.343862 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fprpl\" (UniqueName: \"kubernetes.io/projected/26cf91f4-9f21-487d-8ee9-23700f39e900-kube-api-access-fprpl\") pod \"route-controller-manager-595695d48d-g88dq\" (UID: \"26cf91f4-9f21-487d-8ee9-23700f39e900\") " pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.447288 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.499765 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" event={"ID":"3db2587a-f66b-4e3e-855f-9973e9b28743","Type":"ContainerDied","Data":"456ae351a251d151fea49e6f19e6eb9dec882c42d7b5599fb86ab622c2053df9"} Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.499814 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.499844 4873 scope.go:117] "RemoveContainer" containerID="bc6dcfd23752a86e6e1bb6f7dd9d0bc1aa50316ea10ce09d6252d4ba5b4e6b9f" Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.528090 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x"] Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.531184 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6685f4fd5b-cmw9x"] Feb 19 09:51:24 crc kubenswrapper[4873]: I0219 09:51:24.662233 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq"] Feb 19 09:51:25 crc kubenswrapper[4873]: I0219 09:51:25.496027 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db2587a-f66b-4e3e-855f-9973e9b28743" path="/var/lib/kubelet/pods/3db2587a-f66b-4e3e-855f-9973e9b28743/volumes" Feb 19 09:51:25 crc kubenswrapper[4873]: I0219 09:51:25.507809 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" 
event={"ID":"26cf91f4-9f21-487d-8ee9-23700f39e900","Type":"ContainerStarted","Data":"58fec737d644d6a07362ed517738619806fbbf08b24f9d937d2d1a85a0742dd8"} Feb 19 09:51:25 crc kubenswrapper[4873]: I0219 09:51:25.507874 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" event={"ID":"26cf91f4-9f21-487d-8ee9-23700f39e900","Type":"ContainerStarted","Data":"8b7eb58a34f6f670f3aaa5180f495d0f1ad02db6e966a32f0a45dbd24399f695"} Feb 19 09:51:25 crc kubenswrapper[4873]: I0219 09:51:25.509567 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:25 crc kubenswrapper[4873]: I0219 09:51:25.514055 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" Feb 19 09:51:25 crc kubenswrapper[4873]: I0219 09:51:25.536776 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-595695d48d-g88dq" podStartSLOduration=3.536755834 podStartE2EDuration="3.536755834s" podCreationTimestamp="2026-02-19 09:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:51:25.534973099 +0000 UTC m=+394.824404767" watchObservedRunningTime="2026-02-19 09:51:25.536755834 +0000 UTC m=+394.826187482" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.095851 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" podUID="2948a5a7-4d94-4314-acdf-489dd93609b9" containerName="registry" containerID="cri-o://1996e733635906962b7b2e8b3762e89eeadec10f1b534e11c0d90dd0767471bc" gracePeriod=30 Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.479730 4873 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.606420 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"2948a5a7-4d94-4314-acdf-489dd93609b9\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.606477 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2948a5a7-4d94-4314-acdf-489dd93609b9-registry-certificates\") pod \"2948a5a7-4d94-4314-acdf-489dd93609b9\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.606519 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-registry-tls\") pod \"2948a5a7-4d94-4314-acdf-489dd93609b9\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.606556 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2948a5a7-4d94-4314-acdf-489dd93609b9-trusted-ca\") pod \"2948a5a7-4d94-4314-acdf-489dd93609b9\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.606610 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2948a5a7-4d94-4314-acdf-489dd93609b9-ca-trust-extracted\") pod \"2948a5a7-4d94-4314-acdf-489dd93609b9\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.606660 4873 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7wm6\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-kube-api-access-q7wm6\") pod \"2948a5a7-4d94-4314-acdf-489dd93609b9\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.606686 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-bound-sa-token\") pod \"2948a5a7-4d94-4314-acdf-489dd93609b9\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.606714 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2948a5a7-4d94-4314-acdf-489dd93609b9-installation-pull-secrets\") pod \"2948a5a7-4d94-4314-acdf-489dd93609b9\" (UID: \"2948a5a7-4d94-4314-acdf-489dd93609b9\") " Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.607890 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2948a5a7-4d94-4314-acdf-489dd93609b9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2948a5a7-4d94-4314-acdf-489dd93609b9" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.608642 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2948a5a7-4d94-4314-acdf-489dd93609b9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2948a5a7-4d94-4314-acdf-489dd93609b9" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.613289 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-kube-api-access-q7wm6" (OuterVolumeSpecName: "kube-api-access-q7wm6") pod "2948a5a7-4d94-4314-acdf-489dd93609b9" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9"). InnerVolumeSpecName "kube-api-access-q7wm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.614064 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2948a5a7-4d94-4314-acdf-489dd93609b9" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.614347 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2948a5a7-4d94-4314-acdf-489dd93609b9" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.617479 4873 generic.go:334] "Generic (PLEG): container finished" podID="2948a5a7-4d94-4314-acdf-489dd93609b9" containerID="1996e733635906962b7b2e8b3762e89eeadec10f1b534e11c0d90dd0767471bc" exitCode=0 Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.617525 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" event={"ID":"2948a5a7-4d94-4314-acdf-489dd93609b9","Type":"ContainerDied","Data":"1996e733635906962b7b2e8b3762e89eeadec10f1b534e11c0d90dd0767471bc"} Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.617553 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" event={"ID":"2948a5a7-4d94-4314-acdf-489dd93609b9","Type":"ContainerDied","Data":"9186481593e0db9c07ae375e1f7f148954394edd55d25a1feea71b13835f9c08"} Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.617578 4873 scope.go:117] "RemoveContainer" containerID="1996e733635906962b7b2e8b3762e89eeadec10f1b534e11c0d90dd0767471bc" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.617695 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-7hhjq" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.622497 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2948a5a7-4d94-4314-acdf-489dd93609b9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2948a5a7-4d94-4314-acdf-489dd93609b9" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.625125 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "2948a5a7-4d94-4314-acdf-489dd93609b9" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.625459 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2948a5a7-4d94-4314-acdf-489dd93609b9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2948a5a7-4d94-4314-acdf-489dd93609b9" (UID: "2948a5a7-4d94-4314-acdf-489dd93609b9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.664909 4873 scope.go:117] "RemoveContainer" containerID="1996e733635906962b7b2e8b3762e89eeadec10f1b534e11c0d90dd0767471bc" Feb 19 09:51:42 crc kubenswrapper[4873]: E0219 09:51:42.665571 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1996e733635906962b7b2e8b3762e89eeadec10f1b534e11c0d90dd0767471bc\": container with ID starting with 1996e733635906962b7b2e8b3762e89eeadec10f1b534e11c0d90dd0767471bc not found: ID does not exist" containerID="1996e733635906962b7b2e8b3762e89eeadec10f1b534e11c0d90dd0767471bc" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.665627 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1996e733635906962b7b2e8b3762e89eeadec10f1b534e11c0d90dd0767471bc"} err="failed to get container status \"1996e733635906962b7b2e8b3762e89eeadec10f1b534e11c0d90dd0767471bc\": rpc error: 
code = NotFound desc = could not find container \"1996e733635906962b7b2e8b3762e89eeadec10f1b534e11c0d90dd0767471bc\": container with ID starting with 1996e733635906962b7b2e8b3762e89eeadec10f1b534e11c0d90dd0767471bc not found: ID does not exist" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.708148 4873 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2948a5a7-4d94-4314-acdf-489dd93609b9-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.708193 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7wm6\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-kube-api-access-q7wm6\") on node \"crc\" DevicePath \"\"" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.708208 4873 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.708220 4873 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2948a5a7-4d94-4314-acdf-489dd93609b9-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.708231 4873 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2948a5a7-4d94-4314-acdf-489dd93609b9-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.708242 4873 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2948a5a7-4d94-4314-acdf-489dd93609b9-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.708252 4873 reconciler_common.go:293] 
"Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2948a5a7-4d94-4314-acdf-489dd93609b9-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.949279 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7hhjq"] Feb 19 09:51:42 crc kubenswrapper[4873]: I0219 09:51:42.956370 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-7hhjq"] Feb 19 09:51:43 crc kubenswrapper[4873]: I0219 09:51:43.491658 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2948a5a7-4d94-4314-acdf-489dd93609b9" path="/var/lib/kubelet/pods/2948a5a7-4d94-4314-acdf-489dd93609b9/volumes" Feb 19 09:51:48 crc kubenswrapper[4873]: I0219 09:51:48.240373 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:51:48 crc kubenswrapper[4873]: I0219 09:51:48.240818 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:51:48 crc kubenswrapper[4873]: I0219 09:51:48.240913 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:51:48 crc kubenswrapper[4873]: I0219 09:51:48.242873 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"5ecc9e74f65542c5ba1361ec123b0a6a0ddd50ca3d18c190393ca23d1531b88a"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 09:51:48 crc kubenswrapper[4873]: I0219 09:51:48.243232 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://5ecc9e74f65542c5ba1361ec123b0a6a0ddd50ca3d18c190393ca23d1531b88a" gracePeriod=600 Feb 19 09:51:48 crc kubenswrapper[4873]: I0219 09:51:48.669241 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="5ecc9e74f65542c5ba1361ec123b0a6a0ddd50ca3d18c190393ca23d1531b88a" exitCode=0 Feb 19 09:51:48 crc kubenswrapper[4873]: I0219 09:51:48.669298 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"5ecc9e74f65542c5ba1361ec123b0a6a0ddd50ca3d18c190393ca23d1531b88a"} Feb 19 09:51:48 crc kubenswrapper[4873]: I0219 09:51:48.669605 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"ebff3f80b0b9d54ded2014067bb39816bc67366aec6359774e3b0cd08dfce552"} Feb 19 09:51:48 crc kubenswrapper[4873]: I0219 09:51:48.669630 4873 scope.go:117] "RemoveContainer" containerID="9afd159bda3d2cbd676930ddf4df8cf39b5da5575d7c4d647ae91446f1b76837" Feb 19 09:53:48 crc kubenswrapper[4873]: I0219 09:53:48.240554 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:53:48 crc kubenswrapper[4873]: I0219 09:53:48.241210 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.193048 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-zhqgv"] Feb 19 09:54:12 crc kubenswrapper[4873]: E0219 09:54:12.193751 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2948a5a7-4d94-4314-acdf-489dd93609b9" containerName="registry" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.193766 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="2948a5a7-4d94-4314-acdf-489dd93609b9" containerName="registry" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.193867 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="2948a5a7-4d94-4314-acdf-489dd93609b9" containerName="registry" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.194210 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-zhqgv" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.196817 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.197329 4873 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-9xdhb" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.198513 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.203867 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-ckd42"] Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.204485 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-ckd42" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.206139 4873 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-j6cgv" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.211440 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-zhqgv"] Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.227579 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-fhd9c"] Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.228254 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-fhd9c" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.229993 4873 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-kzvgl" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.240610 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-ckd42"] Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.244087 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-fhd9c"] Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.250748 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pds7j\" (UniqueName: \"kubernetes.io/projected/084c90b4-3270-4f64-8c8c-1a96f05dc1fa-kube-api-access-pds7j\") pod \"cert-manager-cainjector-cf98fcc89-zhqgv\" (UID: \"084c90b4-3270-4f64-8c8c-1a96f05dc1fa\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-zhqgv" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.250838 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7nzs\" (UniqueName: \"kubernetes.io/projected/51fc361b-11a5-480a-a5b9-0eb4b7670e83-kube-api-access-z7nzs\") pod \"cert-manager-858654f9db-ckd42\" (UID: \"51fc361b-11a5-480a-a5b9-0eb4b7670e83\") " pod="cert-manager/cert-manager-858654f9db-ckd42" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.351606 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pds7j\" (UniqueName: \"kubernetes.io/projected/084c90b4-3270-4f64-8c8c-1a96f05dc1fa-kube-api-access-pds7j\") pod \"cert-manager-cainjector-cf98fcc89-zhqgv\" (UID: \"084c90b4-3270-4f64-8c8c-1a96f05dc1fa\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-zhqgv" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.351687 4873 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7nzs\" (UniqueName: \"kubernetes.io/projected/51fc361b-11a5-480a-a5b9-0eb4b7670e83-kube-api-access-z7nzs\") pod \"cert-manager-858654f9db-ckd42\" (UID: \"51fc361b-11a5-480a-a5b9-0eb4b7670e83\") " pod="cert-manager/cert-manager-858654f9db-ckd42" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.351720 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n5z9\" (UniqueName: \"kubernetes.io/projected/2eebe311-368b-45b4-9e74-7442221e3785-kube-api-access-2n5z9\") pod \"cert-manager-webhook-687f57d79b-fhd9c\" (UID: \"2eebe311-368b-45b4-9e74-7442221e3785\") " pod="cert-manager/cert-manager-webhook-687f57d79b-fhd9c" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.376645 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7nzs\" (UniqueName: \"kubernetes.io/projected/51fc361b-11a5-480a-a5b9-0eb4b7670e83-kube-api-access-z7nzs\") pod \"cert-manager-858654f9db-ckd42\" (UID: \"51fc361b-11a5-480a-a5b9-0eb4b7670e83\") " pod="cert-manager/cert-manager-858654f9db-ckd42" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.380831 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pds7j\" (UniqueName: \"kubernetes.io/projected/084c90b4-3270-4f64-8c8c-1a96f05dc1fa-kube-api-access-pds7j\") pod \"cert-manager-cainjector-cf98fcc89-zhqgv\" (UID: \"084c90b4-3270-4f64-8c8c-1a96f05dc1fa\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-zhqgv" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.452536 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n5z9\" (UniqueName: \"kubernetes.io/projected/2eebe311-368b-45b4-9e74-7442221e3785-kube-api-access-2n5z9\") pod \"cert-manager-webhook-687f57d79b-fhd9c\" (UID: \"2eebe311-368b-45b4-9e74-7442221e3785\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-fhd9c" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.474190 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n5z9\" (UniqueName: \"kubernetes.io/projected/2eebe311-368b-45b4-9e74-7442221e3785-kube-api-access-2n5z9\") pod \"cert-manager-webhook-687f57d79b-fhd9c\" (UID: \"2eebe311-368b-45b4-9e74-7442221e3785\") " pod="cert-manager/cert-manager-webhook-687f57d79b-fhd9c" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.520191 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-zhqgv" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.524752 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-ckd42" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.540426 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-fhd9c" Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.773479 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-zhqgv"] Feb 19 09:54:12 crc kubenswrapper[4873]: I0219 09:54:12.781705 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 09:54:13 crc kubenswrapper[4873]: I0219 09:54:13.035271 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-fhd9c"] Feb 19 09:54:13 crc kubenswrapper[4873]: W0219 09:54:13.037169 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51fc361b_11a5_480a_a5b9_0eb4b7670e83.slice/crio-3fcf653b376a5d232c21eb305f3bae7952639ec416199326fc9d2e887fc4cd51 WatchSource:0}: Error finding container 3fcf653b376a5d232c21eb305f3bae7952639ec416199326fc9d2e887fc4cd51: Status 
404 returned error can't find the container with id 3fcf653b376a5d232c21eb305f3bae7952639ec416199326fc9d2e887fc4cd51 Feb 19 09:54:13 crc kubenswrapper[4873]: I0219 09:54:13.040896 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-ckd42"] Feb 19 09:54:13 crc kubenswrapper[4873]: I0219 09:54:13.497304 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-ckd42" event={"ID":"51fc361b-11a5-480a-a5b9-0eb4b7670e83","Type":"ContainerStarted","Data":"3fcf653b376a5d232c21eb305f3bae7952639ec416199326fc9d2e887fc4cd51"} Feb 19 09:54:13 crc kubenswrapper[4873]: I0219 09:54:13.497378 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-fhd9c" event={"ID":"2eebe311-368b-45b4-9e74-7442221e3785","Type":"ContainerStarted","Data":"fe3e858fa62b4f2a35aee25b4cfc4b7ad57f92c3e5819cec30c79d9d77529f37"} Feb 19 09:54:13 crc kubenswrapper[4873]: I0219 09:54:13.497401 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-zhqgv" event={"ID":"084c90b4-3270-4f64-8c8c-1a96f05dc1fa","Type":"ContainerStarted","Data":"f6699ab394ac1e0d1433a35c173c4d6f6d549698ab2e4369c59d6ac16377af59"} Feb 19 09:54:15 crc kubenswrapper[4873]: I0219 09:54:15.511641 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-zhqgv" event={"ID":"084c90b4-3270-4f64-8c8c-1a96f05dc1fa","Type":"ContainerStarted","Data":"478a0b97c98403eaaf3117b0eaf5a8fc83acb96c391a0b2999b1a23922abda02"} Feb 19 09:54:15 crc kubenswrapper[4873]: I0219 09:54:15.528521 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-zhqgv" podStartSLOduration=1.199860442 podStartE2EDuration="3.528492785s" podCreationTimestamp="2026-02-19 09:54:12 +0000 UTC" firstStartedPulling="2026-02-19 09:54:12.781417812 +0000 UTC m=+562.070849460" 
lastFinishedPulling="2026-02-19 09:54:15.110050165 +0000 UTC m=+564.399481803" observedRunningTime="2026-02-19 09:54:15.525276567 +0000 UTC m=+564.814708215" watchObservedRunningTime="2026-02-19 09:54:15.528492785 +0000 UTC m=+564.817924423" Feb 19 09:54:17 crc kubenswrapper[4873]: I0219 09:54:17.537787 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-ckd42" event={"ID":"51fc361b-11a5-480a-a5b9-0eb4b7670e83","Type":"ContainerStarted","Data":"37903da37b7892f7f7f358707156238c37595ef3e70f73adcd90a7236e105f34"} Feb 19 09:54:17 crc kubenswrapper[4873]: I0219 09:54:17.540875 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-fhd9c" event={"ID":"2eebe311-368b-45b4-9e74-7442221e3785","Type":"ContainerStarted","Data":"3feba744fc116718f3cbb1c1da4b7e8c8c044a5f2e20e19597dc969073b648a9"} Feb 19 09:54:17 crc kubenswrapper[4873]: I0219 09:54:17.541187 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-fhd9c" Feb 19 09:54:17 crc kubenswrapper[4873]: I0219 09:54:17.557243 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-ckd42" podStartSLOduration=1.734597703 podStartE2EDuration="5.557226786s" podCreationTimestamp="2026-02-19 09:54:12 +0000 UTC" firstStartedPulling="2026-02-19 09:54:13.039307599 +0000 UTC m=+562.328739237" lastFinishedPulling="2026-02-19 09:54:16.861936672 +0000 UTC m=+566.151368320" observedRunningTime="2026-02-19 09:54:17.556617321 +0000 UTC m=+566.846048959" watchObservedRunningTime="2026-02-19 09:54:17.557226786 +0000 UTC m=+566.846658424" Feb 19 09:54:17 crc kubenswrapper[4873]: I0219 09:54:17.579006 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-fhd9c" podStartSLOduration=1.810077937 podStartE2EDuration="5.578991272s" podCreationTimestamp="2026-02-19 
09:54:12 +0000 UTC" firstStartedPulling="2026-02-19 09:54:13.033867787 +0000 UTC m=+562.323299425" lastFinishedPulling="2026-02-19 09:54:16.802781122 +0000 UTC m=+566.092212760" observedRunningTime="2026-02-19 09:54:17.576936662 +0000 UTC m=+566.866368320" watchObservedRunningTime="2026-02-19 09:54:17.578991272 +0000 UTC m=+566.868422900" Feb 19 09:54:18 crc kubenswrapper[4873]: I0219 09:54:18.240963 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:54:18 crc kubenswrapper[4873]: I0219 09:54:18.241596 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.138866 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j94bh"] Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.139789 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3" gracePeriod=30 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.139845 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="northd" containerID="cri-o://fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2" gracePeriod=30 Feb 
19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.139832 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="nbdb" containerID="cri-o://cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6" gracePeriod=30 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.139951 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="kube-rbac-proxy-node" containerID="cri-o://531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe" gracePeriod=30 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.140015 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovn-acl-logging" containerID="cri-o://818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4" gracePeriod=30 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.140059 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovn-controller" containerID="cri-o://ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd" gracePeriod=30 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.140068 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="sbdb" containerID="cri-o://02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578" gracePeriod=30 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.197762 4873 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" containerID="cri-o://fbe398acea08ecbb128c7f23474abd3c929b29591afd83ce34befc3c628c7ddb" gracePeriod=30 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.543828 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-fhd9c" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.577953 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovnkube-controller/3.log" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.583871 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovn-acl-logging/0.log" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.584539 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovn-controller/0.log" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585036 4873 generic.go:334] "Generic (PLEG): container finished" podID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerID="fbe398acea08ecbb128c7f23474abd3c929b29591afd83ce34befc3c628c7ddb" exitCode=0 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585075 4873 generic.go:334] "Generic (PLEG): container finished" podID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerID="02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578" exitCode=0 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585084 4873 generic.go:334] "Generic (PLEG): container finished" podID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerID="cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6" exitCode=0 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585095 4873 generic.go:334] "Generic (PLEG): 
container finished" podID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerID="fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2" exitCode=0 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585130 4873 generic.go:334] "Generic (PLEG): container finished" podID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerID="c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3" exitCode=0 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585127 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerDied","Data":"fbe398acea08ecbb128c7f23474abd3c929b29591afd83ce34befc3c628c7ddb"} Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585172 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerDied","Data":"02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578"} Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585186 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerDied","Data":"cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6"} Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585199 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerDied","Data":"fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2"} Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585140 4873 generic.go:334] "Generic (PLEG): container finished" podID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerID="531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe" exitCode=0 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585211 
4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerDied","Data":"c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3"} Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585218 4873 generic.go:334] "Generic (PLEG): container finished" podID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerID="818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4" exitCode=143 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585223 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerDied","Data":"531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe"} Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585229 4873 generic.go:334] "Generic (PLEG): container finished" podID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerID="ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd" exitCode=143 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585399 4873 scope.go:117] "RemoveContainer" containerID="e579ee790b5fefd1d5bf854d00160acec2752733df35041a4f9ec15c5d947308" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585559 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerDied","Data":"818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4"} Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.585581 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerDied","Data":"ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd"} Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.588006 4873 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-multus_multus-4pk8x_e1ae3d8d-27cf-489f-a6ba-ef914db74bff/kube-multus/2.log" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.588660 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4pk8x_e1ae3d8d-27cf-489f-a6ba-ef914db74bff/kube-multus/1.log" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.588711 4873 generic.go:334] "Generic (PLEG): container finished" podID="e1ae3d8d-27cf-489f-a6ba-ef914db74bff" containerID="ef1d74ca48faafc4bbde6d98d0cbea070a166074ced1ae06003180d6fd64ebb2" exitCode=2 Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.588740 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4pk8x" event={"ID":"e1ae3d8d-27cf-489f-a6ba-ef914db74bff","Type":"ContainerDied","Data":"ef1d74ca48faafc4bbde6d98d0cbea070a166074ced1ae06003180d6fd64ebb2"} Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.589249 4873 scope.go:117] "RemoveContainer" containerID="ef1d74ca48faafc4bbde6d98d0cbea070a166074ced1ae06003180d6fd64ebb2" Feb 19 09:54:22 crc kubenswrapper[4873]: E0219 09:54:22.589504 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-4pk8x_openshift-multus(e1ae3d8d-27cf-489f-a6ba-ef914db74bff)\"" pod="openshift-multus/multus-4pk8x" podUID="e1ae3d8d-27cf-489f-a6ba-ef914db74bff" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.627125 4873 scope.go:117] "RemoveContainer" containerID="81ec7da29e3b03fb97f0d183d69bb256ed8f7340ca5df7e0c44bedd129b968cc" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.900586 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovn-acl-logging/0.log" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.901461 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovn-controller/0.log" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.903284 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.974707 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z4jgv"] Feb 19 09:54:22 crc kubenswrapper[4873]: E0219 09:54:22.974927 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovn-acl-logging" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.974943 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovn-acl-logging" Feb 19 09:54:22 crc kubenswrapper[4873]: E0219 09:54:22.974953 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.974961 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 09:54:22 crc kubenswrapper[4873]: E0219 09:54:22.974973 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="sbdb" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.974981 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="sbdb" Feb 19 09:54:22 crc kubenswrapper[4873]: E0219 09:54:22.974989 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="nbdb" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.974997 4873 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="nbdb" Feb 19 09:54:22 crc kubenswrapper[4873]: E0219 09:54:22.975006 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975013 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: E0219 09:54:22.975022 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="kubecfg-setup" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975030 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="kubecfg-setup" Feb 19 09:54:22 crc kubenswrapper[4873]: E0219 09:54:22.975041 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975047 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: E0219 09:54:22.975056 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975062 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: E0219 09:54:22.975071 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975078 4873 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: E0219 09:54:22.975090 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="kube-rbac-proxy-node" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975096 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="kube-rbac-proxy-node" Feb 19 09:54:22 crc kubenswrapper[4873]: E0219 09:54:22.975208 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovn-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975219 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovn-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: E0219 09:54:22.975231 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="northd" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975240 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="northd" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975344 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="kube-rbac-proxy-node" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975355 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="nbdb" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975363 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975371 4873 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="northd" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975382 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975392 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975399 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovn-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975410 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975419 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="sbdb" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975430 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovn-acl-logging" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975441 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975450 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: E0219 09:54:22.975557 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.975567 4873 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" containerName="ovnkube-controller" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.977122 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.998502 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-kubelet\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.998552 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-ovn\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.998585 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-systemd\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.998624 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-ovnkube-script-lib\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.998629 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" 
(UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.998655 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7760a15-9ea0-42f0-b42b-72de30071d14-ovn-node-metrics-cert\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.998732 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-env-overrides\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.998761 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-cni-bin\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.998785 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-openvswitch\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.998819 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-slash\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.998858 4873 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.998917 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-node-log\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.998948 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-run-netns\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.998980 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-systemd-units\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999013 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-ovnkube-config\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999038 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-etc-openvswitch\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999065 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-log-socket\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999094 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz7vl\" (UniqueName: \"kubernetes.io/projected/a7760a15-9ea0-42f0-b42b-72de30071d14-kube-api-access-vz7vl\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999138 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-cni-netd\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999159 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-run-ovn-kubernetes\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999184 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-var-lib-openvswitch\") pod \"a7760a15-9ea0-42f0-b42b-72de30071d14\" (UID: \"a7760a15-9ea0-42f0-b42b-72de30071d14\") " Feb 19 09:54:22 crc 
kubenswrapper[4873]: I0219 09:54:22.999345 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-run-systemd\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999380 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-kubelet\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999424 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-cni-bin\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999445 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-env-overrides\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999472 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-node-log\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc 
kubenswrapper[4873]: I0219 09:54:22.999493 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999516 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-systemd-units\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999535 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-run-openvswitch\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999582 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-etc-openvswitch\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999611 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-slash\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999639 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-cni-netd\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999666 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-run-netns\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999696 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-ovnkube-script-lib\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999728 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrbxf\" (UniqueName: \"kubernetes.io/projected/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-kube-api-access-nrbxf\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999753 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999782 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-var-lib-openvswitch\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999812 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-run-ovn\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999851 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-ovn-node-metrics-cert\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999886 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-log-socket\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:22 crc kubenswrapper[4873]: I0219 09:54:22.999927 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-ovnkube-config\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:22.999987 4873 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000024 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000087 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000159 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000238 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000255 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-slash" (OuterVolumeSpecName: "host-slash") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000301 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000304 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000315 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000312 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000325 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-node-log" (OuterVolumeSpecName: "node-log") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000346 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000380 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000390 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000392 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-log-socket" (OuterVolumeSpecName: "log-socket") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000683 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.000711 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.005486 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7760a15-9ea0-42f0-b42b-72de30071d14-kube-api-access-vz7vl" (OuterVolumeSpecName: "kube-api-access-vz7vl") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "kube-api-access-vz7vl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.006984 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7760a15-9ea0-42f0-b42b-72de30071d14-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.029240 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a7760a15-9ea0-42f0-b42b-72de30071d14" (UID: "a7760a15-9ea0-42f0-b42b-72de30071d14"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.100821 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-run-systemd\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.100878 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-kubelet\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.100900 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-cni-bin\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.100915 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-env-overrides\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.100931 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-node-log\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 
09:54:23.100946 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.100963 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-systemd-units\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.100978 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-run-openvswitch\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101001 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-etc-openvswitch\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101018 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-slash\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101035 4873 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-cni-netd\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101050 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-run-netns\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101066 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-ovnkube-script-lib\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101083 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrbxf\" (UniqueName: \"kubernetes.io/projected/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-kube-api-access-nrbxf\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101095 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-run-ovn-kubernetes\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101130 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-var-lib-openvswitch\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101153 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-run-ovn\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101180 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-ovn-node-metrics-cert\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101200 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-log-socket\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101227 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-ovnkube-config\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101271 4873 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-slash\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101282 4873 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101291 4873 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-node-log\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101301 4873 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101309 4873 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101317 4873 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101325 4873 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101334 4873 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-log-socket\") on node 
\"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101342 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz7vl\" (UniqueName: \"kubernetes.io/projected/a7760a15-9ea0-42f0-b42b-72de30071d14-kube-api-access-vz7vl\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101350 4873 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101360 4873 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101368 4873 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101377 4873 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101384 4873 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101392 4873 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101400 4873 
reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7760a15-9ea0-42f0-b42b-72de30071d14-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101409 4873 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a7760a15-9ea0-42f0-b42b-72de30071d14-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101433 4873 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.101441 4873 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a7760a15-9ea0-42f0-b42b-72de30071d14-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.102154 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-ovnkube-config\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.102196 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-run-systemd\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.102219 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-kubelet\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.102239 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-cni-bin\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.102550 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-env-overrides\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.102582 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-node-log\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.102604 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.102624 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-systemd-units\") pod 
\"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.102642 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-run-openvswitch\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.102661 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-etc-openvswitch\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.102681 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-slash\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.102706 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-cni-netd\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.102728 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-run-netns\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" 
Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.103013 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-log-socket\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.102978 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-var-lib-openvswitch\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.103023 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-host-run-ovn-kubernetes\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.103068 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-run-ovn\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.103157 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-ovnkube-script-lib\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.106264 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-ovn-node-metrics-cert\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.120739 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrbxf\" (UniqueName: \"kubernetes.io/projected/17ff5bfa-ab64-4787-83f7-a1c0f76e0e52-kube-api-access-nrbxf\") pod \"ovnkube-node-z4jgv\" (UID: \"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52\") " pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.290842 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.599553 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovn-acl-logging/0.log" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.600823 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j94bh_a7760a15-9ea0-42f0-b42b-72de30071d14/ovn-controller/0.log" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.601407 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" event={"ID":"a7760a15-9ea0-42f0-b42b-72de30071d14","Type":"ContainerDied","Data":"542002b7bbc20e4e4f7ed68e13539e1b5d49a0679ef11d6b86cc15c762bc318b"} Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.601462 4873 scope.go:117] "RemoveContainer" containerID="fbe398acea08ecbb128c7f23474abd3c929b29591afd83ce34befc3c628c7ddb" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.601598 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j94bh" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.604801 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4pk8x_e1ae3d8d-27cf-489f-a6ba-ef914db74bff/kube-multus/2.log" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.608266 4873 generic.go:334] "Generic (PLEG): container finished" podID="17ff5bfa-ab64-4787-83f7-a1c0f76e0e52" containerID="df2fccfc9c89e51ac4652b545fbabaef56297c048277b07a756768a2b0ee26f0" exitCode=0 Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.608331 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" event={"ID":"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52","Type":"ContainerDied","Data":"df2fccfc9c89e51ac4652b545fbabaef56297c048277b07a756768a2b0ee26f0"} Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.608381 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" event={"ID":"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52","Type":"ContainerStarted","Data":"73206db7004dbb28fd56f4c709771c027dca64cc0dcf37bbaad7dab94fa42938"} Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.630523 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j94bh"] Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.631003 4873 scope.go:117] "RemoveContainer" containerID="02bb2bc90cc5fcd65d83ff7114e7523b46ae5d732b1201b2e60f7514ee3bb578" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.636072 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j94bh"] Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.660596 4873 scope.go:117] "RemoveContainer" containerID="cea4929f2a8788170b1d6989bdf1e48c5bd784c7bbb9b61b05c097142900d3a6" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.674196 4873 scope.go:117] "RemoveContainer" 
containerID="fff94e9f6ecb5d589a39fbb6470fba09f150e5017921cd07183dcf45906b6bd2" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.697890 4873 scope.go:117] "RemoveContainer" containerID="c3864b7b7b0262329654b4163ef0f78770d072126fe647477af8481dcc7dbfa3" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.717632 4873 scope.go:117] "RemoveContainer" containerID="531c5f175091712c62d61b9bd246031a1be5625e2035eb74646b9a4a8ebbacfe" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.744303 4873 scope.go:117] "RemoveContainer" containerID="818daed009518f5262f167ecc83ecfbebaed5e714be5457d6c1dbb51d83ed9d4" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.759470 4873 scope.go:117] "RemoveContainer" containerID="ca55e531374d74576c0356953177e827ea0a40d99f6d868d58da80ac67053dbd" Feb 19 09:54:23 crc kubenswrapper[4873]: I0219 09:54:23.775194 4873 scope.go:117] "RemoveContainer" containerID="d6f649d2d74dd0c4285fa3b98974ee330db985e62e9fff24b68e7a028045427b" Feb 19 09:54:24 crc kubenswrapper[4873]: I0219 09:54:24.622061 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" event={"ID":"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52","Type":"ContainerStarted","Data":"b37cdcbfc01849c4081c33074d91a9bdb9d7910165482ec52d5046f02416ab2e"} Feb 19 09:54:24 crc kubenswrapper[4873]: I0219 09:54:24.622415 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" event={"ID":"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52","Type":"ContainerStarted","Data":"90876eee50f60380d3803482056291ca7cbb66d4a94bd642bd4badfbdc28003a"} Feb 19 09:54:24 crc kubenswrapper[4873]: I0219 09:54:24.622428 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" event={"ID":"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52","Type":"ContainerStarted","Data":"8d6930243d79124ba278babd506e04b2816a774458ee69a11165bcea2afc3b95"} Feb 19 09:54:24 crc kubenswrapper[4873]: I0219 09:54:24.622437 
4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" event={"ID":"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52","Type":"ContainerStarted","Data":"7ba7a2fc66e426dad5d3e319ad518d6b99e8a5cb9bfb0c6e6909886ab6142f58"} Feb 19 09:54:24 crc kubenswrapper[4873]: I0219 09:54:24.622445 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" event={"ID":"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52","Type":"ContainerStarted","Data":"f16467f0dcec222b18b211fd35db0821cd93682069022e8e026753865e7ae207"} Feb 19 09:54:24 crc kubenswrapper[4873]: I0219 09:54:24.622453 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" event={"ID":"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52","Type":"ContainerStarted","Data":"753dd2222e3bbd0c8433663f5fc6838c82629fe0b2d6214734f2693b625cc40d"} Feb 19 09:54:25 crc kubenswrapper[4873]: I0219 09:54:25.491638 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7760a15-9ea0-42f0-b42b-72de30071d14" path="/var/lib/kubelet/pods/a7760a15-9ea0-42f0-b42b-72de30071d14/volumes" Feb 19 09:54:26 crc kubenswrapper[4873]: I0219 09:54:26.641927 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" event={"ID":"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52","Type":"ContainerStarted","Data":"e430a338971dfc01c5ac6b0ee1465873f07d954a8715391aec85b6cf9139df0a"} Feb 19 09:54:29 crc kubenswrapper[4873]: I0219 09:54:29.664732 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" event={"ID":"17ff5bfa-ab64-4787-83f7-a1c0f76e0e52","Type":"ContainerStarted","Data":"f69bf6caa1dcadd82d7ed2e8a149250996041d06b3becc88823f02533f26d14d"} Feb 19 09:54:29 crc kubenswrapper[4873]: I0219 09:54:29.665127 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:29 crc 
kubenswrapper[4873]: I0219 09:54:29.665315 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:29 crc kubenswrapper[4873]: I0219 09:54:29.665326 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:29 crc kubenswrapper[4873]: I0219 09:54:29.701436 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:29 crc kubenswrapper[4873]: I0219 09:54:29.708031 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" podStartSLOduration=7.708013731 podStartE2EDuration="7.708013731s" podCreationTimestamp="2026-02-19 09:54:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:54:29.704323242 +0000 UTC m=+578.993754880" watchObservedRunningTime="2026-02-19 09:54:29.708013731 +0000 UTC m=+578.997445379" Feb 19 09:54:29 crc kubenswrapper[4873]: I0219 09:54:29.708812 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:34 crc kubenswrapper[4873]: I0219 09:54:34.484858 4873 scope.go:117] "RemoveContainer" containerID="ef1d74ca48faafc4bbde6d98d0cbea070a166074ced1ae06003180d6fd64ebb2" Feb 19 09:54:34 crc kubenswrapper[4873]: E0219 09:54:34.485629 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-4pk8x_openshift-multus(e1ae3d8d-27cf-489f-a6ba-ef914db74bff)\"" pod="openshift-multus/multus-4pk8x" podUID="e1ae3d8d-27cf-489f-a6ba-ef914db74bff" Feb 19 09:54:47 crc kubenswrapper[4873]: I0219 09:54:47.484639 4873 scope.go:117] "RemoveContainer" 
containerID="ef1d74ca48faafc4bbde6d98d0cbea070a166074ced1ae06003180d6fd64ebb2" Feb 19 09:54:48 crc kubenswrapper[4873]: I0219 09:54:48.209492 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4pk8x_e1ae3d8d-27cf-489f-a6ba-ef914db74bff/kube-multus/2.log" Feb 19 09:54:48 crc kubenswrapper[4873]: I0219 09:54:48.209737 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4pk8x" event={"ID":"e1ae3d8d-27cf-489f-a6ba-ef914db74bff","Type":"ContainerStarted","Data":"6718dd929284093b0608531f70803abc21ce790d2867e131107c90a949a950c0"} Feb 19 09:54:48 crc kubenswrapper[4873]: I0219 09:54:48.240549 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:54:48 crc kubenswrapper[4873]: I0219 09:54:48.240602 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:54:48 crc kubenswrapper[4873]: I0219 09:54:48.240641 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:54:48 crc kubenswrapper[4873]: I0219 09:54:48.241170 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ebff3f80b0b9d54ded2014067bb39816bc67366aec6359774e3b0cd08dfce552"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 
09:54:48 crc kubenswrapper[4873]: I0219 09:54:48.241225 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://ebff3f80b0b9d54ded2014067bb39816bc67366aec6359774e3b0cd08dfce552" gracePeriod=600 Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.217865 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="ebff3f80b0b9d54ded2014067bb39816bc67366aec6359774e3b0cd08dfce552" exitCode=0 Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.217940 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"ebff3f80b0b9d54ded2014067bb39816bc67366aec6359774e3b0cd08dfce552"} Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.218165 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"025da7fd171f987961d862fe4ebef489eca80227003392ad78806aa501904663"} Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.218185 4873 scope.go:117] "RemoveContainer" containerID="5ecc9e74f65542c5ba1361ec123b0a6a0ddd50ca3d18c190393ca23d1531b88a" Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.469893 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn"] Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.471546 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.476467 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.482791 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn"] Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.534966 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0709e82b-60e9-4aed-8e42-e39928e74c21-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn\" (UID: \"0709e82b-60e9-4aed-8e42-e39928e74c21\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.535023 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0709e82b-60e9-4aed-8e42-e39928e74c21-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn\" (UID: \"0709e82b-60e9-4aed-8e42-e39928e74c21\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.535086 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgmn7\" (UniqueName: \"kubernetes.io/projected/0709e82b-60e9-4aed-8e42-e39928e74c21-kube-api-access-xgmn7\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn\" (UID: \"0709e82b-60e9-4aed-8e42-e39928e74c21\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" Feb 19 09:54:49 crc kubenswrapper[4873]: 
I0219 09:54:49.636824 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgmn7\" (UniqueName: \"kubernetes.io/projected/0709e82b-60e9-4aed-8e42-e39928e74c21-kube-api-access-xgmn7\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn\" (UID: \"0709e82b-60e9-4aed-8e42-e39928e74c21\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.637023 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0709e82b-60e9-4aed-8e42-e39928e74c21-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn\" (UID: \"0709e82b-60e9-4aed-8e42-e39928e74c21\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.637066 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0709e82b-60e9-4aed-8e42-e39928e74c21-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn\" (UID: \"0709e82b-60e9-4aed-8e42-e39928e74c21\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.637737 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0709e82b-60e9-4aed-8e42-e39928e74c21-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn\" (UID: \"0709e82b-60e9-4aed-8e42-e39928e74c21\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.637951 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0709e82b-60e9-4aed-8e42-e39928e74c21-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn\" (UID: \"0709e82b-60e9-4aed-8e42-e39928e74c21\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.671699 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgmn7\" (UniqueName: \"kubernetes.io/projected/0709e82b-60e9-4aed-8e42-e39928e74c21-kube-api-access-xgmn7\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn\" (UID: \"0709e82b-60e9-4aed-8e42-e39928e74c21\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" Feb 19 09:54:49 crc kubenswrapper[4873]: I0219 09:54:49.786967 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" Feb 19 09:54:50 crc kubenswrapper[4873]: I0219 09:54:50.202791 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn"] Feb 19 09:54:50 crc kubenswrapper[4873]: W0219 09:54:50.206347 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0709e82b_60e9_4aed_8e42_e39928e74c21.slice/crio-a8faa0998d8cb7380721d946dec8c42971f2c370b0eddff7fbd229e41aab774a WatchSource:0}: Error finding container a8faa0998d8cb7380721d946dec8c42971f2c370b0eddff7fbd229e41aab774a: Status 404 returned error can't find the container with id a8faa0998d8cb7380721d946dec8c42971f2c370b0eddff7fbd229e41aab774a Feb 19 09:54:50 crc kubenswrapper[4873]: I0219 09:54:50.230292 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" 
event={"ID":"0709e82b-60e9-4aed-8e42-e39928e74c21","Type":"ContainerStarted","Data":"a8faa0998d8cb7380721d946dec8c42971f2c370b0eddff7fbd229e41aab774a"} Feb 19 09:54:51 crc kubenswrapper[4873]: I0219 09:54:51.238966 4873 generic.go:334] "Generic (PLEG): container finished" podID="0709e82b-60e9-4aed-8e42-e39928e74c21" containerID="53795ed9c172f5e15725bb68ee5247725d9d892ce1cec0e62f07e21ea580e8d5" exitCode=0 Feb 19 09:54:51 crc kubenswrapper[4873]: I0219 09:54:51.239050 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" event={"ID":"0709e82b-60e9-4aed-8e42-e39928e74c21","Type":"ContainerDied","Data":"53795ed9c172f5e15725bb68ee5247725d9d892ce1cec0e62f07e21ea580e8d5"} Feb 19 09:54:52 crc kubenswrapper[4873]: I0219 09:54:52.245507 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" event={"ID":"0709e82b-60e9-4aed-8e42-e39928e74c21","Type":"ContainerStarted","Data":"3249a389c2df46ad94e087432a2b02af4fdb5db764822d0ff5a78c43d9aa131d"} Feb 19 09:54:53 crc kubenswrapper[4873]: I0219 09:54:53.252098 4873 generic.go:334] "Generic (PLEG): container finished" podID="0709e82b-60e9-4aed-8e42-e39928e74c21" containerID="3249a389c2df46ad94e087432a2b02af4fdb5db764822d0ff5a78c43d9aa131d" exitCode=0 Feb 19 09:54:53 crc kubenswrapper[4873]: I0219 09:54:53.252179 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" event={"ID":"0709e82b-60e9-4aed-8e42-e39928e74c21","Type":"ContainerDied","Data":"3249a389c2df46ad94e087432a2b02af4fdb5db764822d0ff5a78c43d9aa131d"} Feb 19 09:54:53 crc kubenswrapper[4873]: I0219 09:54:53.316511 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z4jgv" Feb 19 09:54:54 crc kubenswrapper[4873]: I0219 09:54:54.260140 4873 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" event={"ID":"0709e82b-60e9-4aed-8e42-e39928e74c21","Type":"ContainerStarted","Data":"7e8e8aee15311b068eb3c75635d192c9297ffb7e135a97626a265ba209a8876f"} Feb 19 09:54:54 crc kubenswrapper[4873]: I0219 09:54:54.283193 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" podStartSLOduration=4.4068980700000004 podStartE2EDuration="5.283166783s" podCreationTimestamp="2026-02-19 09:54:49 +0000 UTC" firstStartedPulling="2026-02-19 09:54:51.241586638 +0000 UTC m=+600.531018316" lastFinishedPulling="2026-02-19 09:54:52.117855391 +0000 UTC m=+601.407287029" observedRunningTime="2026-02-19 09:54:54.279749644 +0000 UTC m=+603.569181322" watchObservedRunningTime="2026-02-19 09:54:54.283166783 +0000 UTC m=+603.572598461" Feb 19 09:54:55 crc kubenswrapper[4873]: I0219 09:54:55.270647 4873 generic.go:334] "Generic (PLEG): container finished" podID="0709e82b-60e9-4aed-8e42-e39928e74c21" containerID="7e8e8aee15311b068eb3c75635d192c9297ffb7e135a97626a265ba209a8876f" exitCode=0 Feb 19 09:54:55 crc kubenswrapper[4873]: I0219 09:54:55.270695 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" event={"ID":"0709e82b-60e9-4aed-8e42-e39928e74c21","Type":"ContainerDied","Data":"7e8e8aee15311b068eb3c75635d192c9297ffb7e135a97626a265ba209a8876f"} Feb 19 09:54:56 crc kubenswrapper[4873]: I0219 09:54:56.595318 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" Feb 19 09:54:56 crc kubenswrapper[4873]: I0219 09:54:56.729113 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0709e82b-60e9-4aed-8e42-e39928e74c21-util\") pod \"0709e82b-60e9-4aed-8e42-e39928e74c21\" (UID: \"0709e82b-60e9-4aed-8e42-e39928e74c21\") " Feb 19 09:54:56 crc kubenswrapper[4873]: I0219 09:54:56.729534 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgmn7\" (UniqueName: \"kubernetes.io/projected/0709e82b-60e9-4aed-8e42-e39928e74c21-kube-api-access-xgmn7\") pod \"0709e82b-60e9-4aed-8e42-e39928e74c21\" (UID: \"0709e82b-60e9-4aed-8e42-e39928e74c21\") " Feb 19 09:54:56 crc kubenswrapper[4873]: I0219 09:54:56.729624 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0709e82b-60e9-4aed-8e42-e39928e74c21-bundle\") pod \"0709e82b-60e9-4aed-8e42-e39928e74c21\" (UID: \"0709e82b-60e9-4aed-8e42-e39928e74c21\") " Feb 19 09:54:56 crc kubenswrapper[4873]: I0219 09:54:56.732121 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0709e82b-60e9-4aed-8e42-e39928e74c21-bundle" (OuterVolumeSpecName: "bundle") pod "0709e82b-60e9-4aed-8e42-e39928e74c21" (UID: "0709e82b-60e9-4aed-8e42-e39928e74c21"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:54:56 crc kubenswrapper[4873]: I0219 09:54:56.737627 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0709e82b-60e9-4aed-8e42-e39928e74c21-kube-api-access-xgmn7" (OuterVolumeSpecName: "kube-api-access-xgmn7") pod "0709e82b-60e9-4aed-8e42-e39928e74c21" (UID: "0709e82b-60e9-4aed-8e42-e39928e74c21"). InnerVolumeSpecName "kube-api-access-xgmn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:54:56 crc kubenswrapper[4873]: I0219 09:54:56.751775 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0709e82b-60e9-4aed-8e42-e39928e74c21-util" (OuterVolumeSpecName: "util") pod "0709e82b-60e9-4aed-8e42-e39928e74c21" (UID: "0709e82b-60e9-4aed-8e42-e39928e74c21"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:54:56 crc kubenswrapper[4873]: I0219 09:54:56.831192 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgmn7\" (UniqueName: \"kubernetes.io/projected/0709e82b-60e9-4aed-8e42-e39928e74c21-kube-api-access-xgmn7\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:56 crc kubenswrapper[4873]: I0219 09:54:56.831377 4873 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0709e82b-60e9-4aed-8e42-e39928e74c21-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:56 crc kubenswrapper[4873]: I0219 09:54:56.831524 4873 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0709e82b-60e9-4aed-8e42-e39928e74c21-util\") on node \"crc\" DevicePath \"\"" Feb 19 09:54:57 crc kubenswrapper[4873]: I0219 09:54:57.288168 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" event={"ID":"0709e82b-60e9-4aed-8e42-e39928e74c21","Type":"ContainerDied","Data":"a8faa0998d8cb7380721d946dec8c42971f2c370b0eddff7fbd229e41aab774a"} Feb 19 09:54:57 crc kubenswrapper[4873]: I0219 09:54:57.288219 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8faa0998d8cb7380721d946dec8c42971f2c370b0eddff7fbd229e41aab774a" Feb 19 09:54:57 crc kubenswrapper[4873]: I0219 09:54:57.288319 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.276212 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-v7nww"] Feb 19 09:55:06 crc kubenswrapper[4873]: E0219 09:55:06.277012 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0709e82b-60e9-4aed-8e42-e39928e74c21" containerName="util" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.277047 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0709e82b-60e9-4aed-8e42-e39928e74c21" containerName="util" Feb 19 09:55:06 crc kubenswrapper[4873]: E0219 09:55:06.277068 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0709e82b-60e9-4aed-8e42-e39928e74c21" containerName="pull" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.277076 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0709e82b-60e9-4aed-8e42-e39928e74c21" containerName="pull" Feb 19 09:55:06 crc kubenswrapper[4873]: E0219 09:55:06.277092 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0709e82b-60e9-4aed-8e42-e39928e74c21" containerName="extract" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.277114 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0709e82b-60e9-4aed-8e42-e39928e74c21" containerName="extract" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.277237 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="0709e82b-60e9-4aed-8e42-e39928e74c21" containerName="extract" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.277682 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7nww" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.282030 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-xz4ck" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.282051 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.283783 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.289565 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-v7nww"] Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.348232 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4q86\" (UniqueName: \"kubernetes.io/projected/5d79d4d8-e595-4aec-bc0b-7347b826c257-kube-api-access-h4q86\") pod \"obo-prometheus-operator-68bc856cb9-v7nww\" (UID: \"5d79d4d8-e595-4aec-bc0b-7347b826c257\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7nww" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.398337 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7"] Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.399215 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.401309 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.401361 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-gdtr7" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.408133 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-qptdb"] Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.408953 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-qptdb" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.414306 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7"] Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.424893 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-qptdb"] Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.452799 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4q86\" (UniqueName: \"kubernetes.io/projected/5d79d4d8-e595-4aec-bc0b-7347b826c257-kube-api-access-h4q86\") pod \"obo-prometheus-operator-68bc856cb9-v7nww\" (UID: \"5d79d4d8-e595-4aec-bc0b-7347b826c257\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7nww" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.473400 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4q86\" (UniqueName: 
\"kubernetes.io/projected/5d79d4d8-e595-4aec-bc0b-7347b826c257-kube-api-access-h4q86\") pod \"obo-prometheus-operator-68bc856cb9-v7nww\" (UID: \"5d79d4d8-e595-4aec-bc0b-7347b826c257\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7nww" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.554508 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4724c979-0040-4017-86ce-78d2a8bdb44e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7496f9f864-qptdb\" (UID: \"4724c979-0040-4017-86ce-78d2a8bdb44e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-qptdb" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.554800 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3180318c-7d9a-454b-8de4-887fabae362b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7\" (UID: \"3180318c-7d9a-454b-8de4-887fabae362b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.554856 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4724c979-0040-4017-86ce-78d2a8bdb44e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7496f9f864-qptdb\" (UID: \"4724c979-0040-4017-86ce-78d2a8bdb44e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-qptdb" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.554896 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3180318c-7d9a-454b-8de4-887fabae362b-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7\" (UID: \"3180318c-7d9a-454b-8de4-887fabae362b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.592049 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-7wtlv"] Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.592908 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-7wtlv" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.593405 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7nww" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.595804 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-xzt2d" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.595804 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.655442 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssjwl\" (UniqueName: \"kubernetes.io/projected/b23281d2-935e-47c1-bc83-8d00c7649625-kube-api-access-ssjwl\") pod \"observability-operator-59bdc8b94-7wtlv\" (UID: \"b23281d2-935e-47c1-bc83-8d00c7649625\") " pod="openshift-operators/observability-operator-59bdc8b94-7wtlv" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.655501 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4724c979-0040-4017-86ce-78d2a8bdb44e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7496f9f864-qptdb\" (UID: \"4724c979-0040-4017-86ce-78d2a8bdb44e\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-qptdb" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.655535 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3180318c-7d9a-454b-8de4-887fabae362b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7\" (UID: \"3180318c-7d9a-454b-8de4-887fabae362b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.655558 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4724c979-0040-4017-86ce-78d2a8bdb44e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7496f9f864-qptdb\" (UID: \"4724c979-0040-4017-86ce-78d2a8bdb44e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-qptdb" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.655593 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3180318c-7d9a-454b-8de4-887fabae362b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7\" (UID: \"3180318c-7d9a-454b-8de4-887fabae362b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.655620 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b23281d2-935e-47c1-bc83-8d00c7649625-observability-operator-tls\") pod \"observability-operator-59bdc8b94-7wtlv\" (UID: \"b23281d2-935e-47c1-bc83-8d00c7649625\") " pod="openshift-operators/observability-operator-59bdc8b94-7wtlv" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.658966 4873 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-7wtlv"] Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.659132 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4724c979-0040-4017-86ce-78d2a8bdb44e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7496f9f864-qptdb\" (UID: \"4724c979-0040-4017-86ce-78d2a8bdb44e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-qptdb" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.659357 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4724c979-0040-4017-86ce-78d2a8bdb44e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7496f9f864-qptdb\" (UID: \"4724c979-0040-4017-86ce-78d2a8bdb44e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-qptdb" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.659549 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3180318c-7d9a-454b-8de4-887fabae362b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7\" (UID: \"3180318c-7d9a-454b-8de4-887fabae362b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.679893 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3180318c-7d9a-454b-8de4-887fabae362b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7\" (UID: \"3180318c-7d9a-454b-8de4-887fabae362b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.715411 4873 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.739913 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-qptdb" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.761035 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b23281d2-935e-47c1-bc83-8d00c7649625-observability-operator-tls\") pod \"observability-operator-59bdc8b94-7wtlv\" (UID: \"b23281d2-935e-47c1-bc83-8d00c7649625\") " pod="openshift-operators/observability-operator-59bdc8b94-7wtlv" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.761090 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssjwl\" (UniqueName: \"kubernetes.io/projected/b23281d2-935e-47c1-bc83-8d00c7649625-kube-api-access-ssjwl\") pod \"observability-operator-59bdc8b94-7wtlv\" (UID: \"b23281d2-935e-47c1-bc83-8d00c7649625\") " pod="openshift-operators/observability-operator-59bdc8b94-7wtlv" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.777140 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b23281d2-935e-47c1-bc83-8d00c7649625-observability-operator-tls\") pod \"observability-operator-59bdc8b94-7wtlv\" (UID: \"b23281d2-935e-47c1-bc83-8d00c7649625\") " pod="openshift-operators/observability-operator-59bdc8b94-7wtlv" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.788827 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssjwl\" (UniqueName: \"kubernetes.io/projected/b23281d2-935e-47c1-bc83-8d00c7649625-kube-api-access-ssjwl\") pod \"observability-operator-59bdc8b94-7wtlv\" (UID: 
\"b23281d2-935e-47c1-bc83-8d00c7649625\") " pod="openshift-operators/observability-operator-59bdc8b94-7wtlv" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.804412 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-8sflg"] Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.805096 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-8sflg" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.807447 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-h248j" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.809103 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-8sflg"] Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.861975 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22x24\" (UniqueName: \"kubernetes.io/projected/ea1cc2c7-c932-4b3d-b718-d017eb06163f-kube-api-access-22x24\") pod \"perses-operator-5bf474d74f-8sflg\" (UID: \"ea1cc2c7-c932-4b3d-b718-d017eb06163f\") " pod="openshift-operators/perses-operator-5bf474d74f-8sflg" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.862040 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea1cc2c7-c932-4b3d-b718-d017eb06163f-openshift-service-ca\") pod \"perses-operator-5bf474d74f-8sflg\" (UID: \"ea1cc2c7-c932-4b3d-b718-d017eb06163f\") " pod="openshift-operators/perses-operator-5bf474d74f-8sflg" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.913007 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-7wtlv" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.967161 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22x24\" (UniqueName: \"kubernetes.io/projected/ea1cc2c7-c932-4b3d-b718-d017eb06163f-kube-api-access-22x24\") pod \"perses-operator-5bf474d74f-8sflg\" (UID: \"ea1cc2c7-c932-4b3d-b718-d017eb06163f\") " pod="openshift-operators/perses-operator-5bf474d74f-8sflg" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.967224 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea1cc2c7-c932-4b3d-b718-d017eb06163f-openshift-service-ca\") pod \"perses-operator-5bf474d74f-8sflg\" (UID: \"ea1cc2c7-c932-4b3d-b718-d017eb06163f\") " pod="openshift-operators/perses-operator-5bf474d74f-8sflg" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.967996 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea1cc2c7-c932-4b3d-b718-d017eb06163f-openshift-service-ca\") pod \"perses-operator-5bf474d74f-8sflg\" (UID: \"ea1cc2c7-c932-4b3d-b718-d017eb06163f\") " pod="openshift-operators/perses-operator-5bf474d74f-8sflg" Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.974796 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-v7nww"] Feb 19 09:55:06 crc kubenswrapper[4873]: I0219 09:55:06.987263 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22x24\" (UniqueName: \"kubernetes.io/projected/ea1cc2c7-c932-4b3d-b718-d017eb06163f-kube-api-access-22x24\") pod \"perses-operator-5bf474d74f-8sflg\" (UID: \"ea1cc2c7-c932-4b3d-b718-d017eb06163f\") " pod="openshift-operators/perses-operator-5bf474d74f-8sflg" Feb 19 09:55:07 crc kubenswrapper[4873]: I0219 
09:55:07.054090 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7"] Feb 19 09:55:07 crc kubenswrapper[4873]: I0219 09:55:07.094474 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-qptdb"] Feb 19 09:55:07 crc kubenswrapper[4873]: I0219 09:55:07.139934 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-8sflg" Feb 19 09:55:07 crc kubenswrapper[4873]: I0219 09:55:07.227235 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-7wtlv"] Feb 19 09:55:07 crc kubenswrapper[4873]: I0219 09:55:07.378243 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-7wtlv" event={"ID":"b23281d2-935e-47c1-bc83-8d00c7649625","Type":"ContainerStarted","Data":"66266c4dcb442623b33dce694f35f2f8d293e2117850e91f3ac0c23322c1e20a"} Feb 19 09:55:07 crc kubenswrapper[4873]: I0219 09:55:07.380566 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7nww" event={"ID":"5d79d4d8-e595-4aec-bc0b-7347b826c257","Type":"ContainerStarted","Data":"2bd1ed5d5780e1aabb0d34f30fbdd2a14ead6a502944d4dab60b868012abf811"} Feb 19 09:55:07 crc kubenswrapper[4873]: I0219 09:55:07.381734 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7" event={"ID":"3180318c-7d9a-454b-8de4-887fabae362b","Type":"ContainerStarted","Data":"41e45e92597c28002120a74d32a9fa24ac5457f23a3a4781e03d58219c226e05"} Feb 19 09:55:07 crc kubenswrapper[4873]: I0219 09:55:07.383138 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-qptdb" 
event={"ID":"4724c979-0040-4017-86ce-78d2a8bdb44e","Type":"ContainerStarted","Data":"2e50cdba4d45ebf1bc8d403a88feb427a8c70b23b0858b1d75be00f2ec8c8e34"} Feb 19 09:55:07 crc kubenswrapper[4873]: I0219 09:55:07.538380 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-8sflg"] Feb 19 09:55:07 crc kubenswrapper[4873]: W0219 09:55:07.539930 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea1cc2c7_c932_4b3d_b718_d017eb06163f.slice/crio-1e978609538a8f9d3cd46e166066d50f8c9196ee54490f470b7d3182bdd86b03 WatchSource:0}: Error finding container 1e978609538a8f9d3cd46e166066d50f8c9196ee54490f470b7d3182bdd86b03: Status 404 returned error can't find the container with id 1e978609538a8f9d3cd46e166066d50f8c9196ee54490f470b7d3182bdd86b03 Feb 19 09:55:08 crc kubenswrapper[4873]: I0219 09:55:08.407428 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-8sflg" event={"ID":"ea1cc2c7-c932-4b3d-b718-d017eb06163f","Type":"ContainerStarted","Data":"1e978609538a8f9d3cd46e166066d50f8c9196ee54490f470b7d3182bdd86b03"} Feb 19 09:55:16 crc kubenswrapper[4873]: I0219 09:55:16.454752 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-7wtlv" event={"ID":"b23281d2-935e-47c1-bc83-8d00c7649625","Type":"ContainerStarted","Data":"954d062155d7c95c0378e0a95b64ba43e82ec0a50bc1e31f4897792f425f4997"} Feb 19 09:55:16 crc kubenswrapper[4873]: I0219 09:55:16.455298 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-7wtlv" Feb 19 09:55:16 crc kubenswrapper[4873]: I0219 09:55:16.455980 4873 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-7wtlv container/operator namespace/openshift-operators: Readiness probe status=failure output="Get 
\"http://10.217.0.36:8081/healthz\": dial tcp 10.217.0.36:8081: connect: connection refused" start-of-body= Feb 19 09:55:16 crc kubenswrapper[4873]: I0219 09:55:16.456033 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-7wtlv" podUID="b23281d2-935e-47c1-bc83-8d00c7649625" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.36:8081/healthz\": dial tcp 10.217.0.36:8081: connect: connection refused" Feb 19 09:55:17 crc kubenswrapper[4873]: I0219 09:55:17.001252 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-7wtlv" Feb 19 09:55:17 crc kubenswrapper[4873]: I0219 09:55:17.034186 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-7wtlv" podStartSLOduration=2.058750119 podStartE2EDuration="11.034162612s" podCreationTimestamp="2026-02-19 09:55:06 +0000 UTC" firstStartedPulling="2026-02-19 09:55:07.24648909 +0000 UTC m=+616.535920728" lastFinishedPulling="2026-02-19 09:55:16.221901583 +0000 UTC m=+625.511333221" observedRunningTime="2026-02-19 09:55:16.476852306 +0000 UTC m=+625.766283934" watchObservedRunningTime="2026-02-19 09:55:17.034162612 +0000 UTC m=+626.323594260" Feb 19 09:55:17 crc kubenswrapper[4873]: I0219 09:55:17.464367 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-8sflg" event={"ID":"ea1cc2c7-c932-4b3d-b718-d017eb06163f","Type":"ContainerStarted","Data":"dffb83fa120d1dc7bd80035ea64649aa1137fd7ec5c7998514e9681f860bc874"} Feb 19 09:55:17 crc kubenswrapper[4873]: I0219 09:55:17.464515 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-8sflg" Feb 19 09:55:17 crc kubenswrapper[4873]: I0219 09:55:17.466915 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7" event={"ID":"3180318c-7d9a-454b-8de4-887fabae362b","Type":"ContainerStarted","Data":"e3078666cdb9158697244ee3b9dd6b856404687a4bb3ce880f9f57c7de9a2e3e"} Feb 19 09:55:17 crc kubenswrapper[4873]: I0219 09:55:17.468901 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-qptdb" event={"ID":"4724c979-0040-4017-86ce-78d2a8bdb44e","Type":"ContainerStarted","Data":"b1e1720c28432393ef1b4d621cfbf0b0639493691ee2131d04b35d73ce3957f0"} Feb 19 09:55:17 crc kubenswrapper[4873]: I0219 09:55:17.471211 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7nww" event={"ID":"5d79d4d8-e595-4aec-bc0b-7347b826c257","Type":"ContainerStarted","Data":"82be0e790e03da7caac4e8f6fd3f220fcb680a98df8bf464f9d8fad94d2cdeec"} Feb 19 09:55:17 crc kubenswrapper[4873]: I0219 09:55:17.502812 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-8sflg" podStartSLOduration=2.866144194 podStartE2EDuration="11.502786066s" podCreationTimestamp="2026-02-19 09:55:06 +0000 UTC" firstStartedPulling="2026-02-19 09:55:07.544040552 +0000 UTC m=+616.833472190" lastFinishedPulling="2026-02-19 09:55:16.180682434 +0000 UTC m=+625.470114062" observedRunningTime="2026-02-19 09:55:17.501953786 +0000 UTC m=+626.791385464" watchObservedRunningTime="2026-02-19 09:55:17.502786066 +0000 UTC m=+626.792217704" Feb 19 09:55:17 crc kubenswrapper[4873]: I0219 09:55:17.535872 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7" podStartSLOduration=2.441441962 podStartE2EDuration="11.535843697s" podCreationTimestamp="2026-02-19 09:55:06 +0000 UTC" firstStartedPulling="2026-02-19 09:55:07.082455861 +0000 UTC m=+616.371887499" 
lastFinishedPulling="2026-02-19 09:55:16.176857596 +0000 UTC m=+625.466289234" observedRunningTime="2026-02-19 09:55:17.527407033 +0000 UTC m=+626.816838681" watchObservedRunningTime="2026-02-19 09:55:17.535843697 +0000 UTC m=+626.825275345" Feb 19 09:55:17 crc kubenswrapper[4873]: I0219 09:55:17.554175 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7496f9f864-qptdb" podStartSLOduration=2.487956504 podStartE2EDuration="11.554155539s" podCreationTimestamp="2026-02-19 09:55:06 +0000 UTC" firstStartedPulling="2026-02-19 09:55:07.109832212 +0000 UTC m=+616.399263850" lastFinishedPulling="2026-02-19 09:55:16.176031237 +0000 UTC m=+625.465462885" observedRunningTime="2026-02-19 09:55:17.549591864 +0000 UTC m=+626.839023552" watchObservedRunningTime="2026-02-19 09:55:17.554155539 +0000 UTC m=+626.843587207" Feb 19 09:55:27 crc kubenswrapper[4873]: I0219 09:55:27.144846 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-8sflg" Feb 19 09:55:27 crc kubenswrapper[4873]: I0219 09:55:27.162822 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7nww" podStartSLOduration=11.973090209 podStartE2EDuration="21.162791398s" podCreationTimestamp="2026-02-19 09:55:06 +0000 UTC" firstStartedPulling="2026-02-19 09:55:07.009261956 +0000 UTC m=+616.298693594" lastFinishedPulling="2026-02-19 09:55:16.198963145 +0000 UTC m=+625.488394783" observedRunningTime="2026-02-19 09:55:17.621677904 +0000 UTC m=+626.911109542" watchObservedRunningTime="2026-02-19 09:55:27.162791398 +0000 UTC m=+636.452223086" Feb 19 09:55:43 crc kubenswrapper[4873]: I0219 09:55:43.594130 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv"] Feb 19 09:55:43 crc kubenswrapper[4873]: 
I0219 09:55:43.596017 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" Feb 19 09:55:43 crc kubenswrapper[4873]: I0219 09:55:43.597865 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 09:55:43 crc kubenswrapper[4873]: I0219 09:55:43.610294 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv"] Feb 19 09:55:43 crc kubenswrapper[4873]: I0219 09:55:43.795250 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a07337-b89d-4574-aa0f-f9a3cdebdd48-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv\" (UID: \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" Feb 19 09:55:43 crc kubenswrapper[4873]: I0219 09:55:43.795325 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14a07337-b89d-4574-aa0f-f9a3cdebdd48-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv\" (UID: \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" Feb 19 09:55:43 crc kubenswrapper[4873]: I0219 09:55:43.795347 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkj5g\" (UniqueName: \"kubernetes.io/projected/14a07337-b89d-4574-aa0f-f9a3cdebdd48-kube-api-access-jkj5g\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv\" (UID: \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\") " 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" Feb 19 09:55:43 crc kubenswrapper[4873]: I0219 09:55:43.896557 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a07337-b89d-4574-aa0f-f9a3cdebdd48-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv\" (UID: \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" Feb 19 09:55:43 crc kubenswrapper[4873]: I0219 09:55:43.896639 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14a07337-b89d-4574-aa0f-f9a3cdebdd48-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv\" (UID: \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" Feb 19 09:55:43 crc kubenswrapper[4873]: I0219 09:55:43.896663 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkj5g\" (UniqueName: \"kubernetes.io/projected/14a07337-b89d-4574-aa0f-f9a3cdebdd48-kube-api-access-jkj5g\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv\" (UID: \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" Feb 19 09:55:43 crc kubenswrapper[4873]: I0219 09:55:43.897096 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a07337-b89d-4574-aa0f-f9a3cdebdd48-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv\" (UID: \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" Feb 19 09:55:43 crc kubenswrapper[4873]: I0219 09:55:43.897448 4873 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14a07337-b89d-4574-aa0f-f9a3cdebdd48-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv\" (UID: \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" Feb 19 09:55:43 crc kubenswrapper[4873]: I0219 09:55:43.930043 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkj5g\" (UniqueName: \"kubernetes.io/projected/14a07337-b89d-4574-aa0f-f9a3cdebdd48-kube-api-access-jkj5g\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv\" (UID: \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" Feb 19 09:55:44 crc kubenswrapper[4873]: I0219 09:55:44.218708 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" Feb 19 09:55:44 crc kubenswrapper[4873]: I0219 09:55:44.643504 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv"] Feb 19 09:55:45 crc kubenswrapper[4873]: I0219 09:55:45.646742 4873 generic.go:334] "Generic (PLEG): container finished" podID="14a07337-b89d-4574-aa0f-f9a3cdebdd48" containerID="a6819f539c003d3080c8371d4c8581e7ea7bd72a27cc888742e3f2b0e593c378" exitCode=0 Feb 19 09:55:45 crc kubenswrapper[4873]: I0219 09:55:45.646787 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" event={"ID":"14a07337-b89d-4574-aa0f-f9a3cdebdd48","Type":"ContainerDied","Data":"a6819f539c003d3080c8371d4c8581e7ea7bd72a27cc888742e3f2b0e593c378"} Feb 19 09:55:45 crc kubenswrapper[4873]: I0219 09:55:45.646815 4873 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" event={"ID":"14a07337-b89d-4574-aa0f-f9a3cdebdd48","Type":"ContainerStarted","Data":"50044929f3c9b7b6a3f1c5020a712e49f731400273f63f27992d4494a8336eb0"} Feb 19 09:55:47 crc kubenswrapper[4873]: I0219 09:55:47.665426 4873 generic.go:334] "Generic (PLEG): container finished" podID="14a07337-b89d-4574-aa0f-f9a3cdebdd48" containerID="3d1c39529f6c4426f0fd82b65011298f8e3885e6afc1443fbf3aeb520d92c38f" exitCode=0 Feb 19 09:55:47 crc kubenswrapper[4873]: I0219 09:55:47.665510 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" event={"ID":"14a07337-b89d-4574-aa0f-f9a3cdebdd48","Type":"ContainerDied","Data":"3d1c39529f6c4426f0fd82b65011298f8e3885e6afc1443fbf3aeb520d92c38f"} Feb 19 09:55:48 crc kubenswrapper[4873]: I0219 09:55:48.675565 4873 generic.go:334] "Generic (PLEG): container finished" podID="14a07337-b89d-4574-aa0f-f9a3cdebdd48" containerID="fffba7f3fa8660d9ca2ad3df6b91fac76d1d73d78fce4e9e00e4992f32248212" exitCode=0 Feb 19 09:55:48 crc kubenswrapper[4873]: I0219 09:55:48.675671 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" event={"ID":"14a07337-b89d-4574-aa0f-f9a3cdebdd48","Type":"ContainerDied","Data":"fffba7f3fa8660d9ca2ad3df6b91fac76d1d73d78fce4e9e00e4992f32248212"} Feb 19 09:55:49 crc kubenswrapper[4873]: I0219 09:55:49.955437 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" Feb 19 09:55:50 crc kubenswrapper[4873]: I0219 09:55:50.037823 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a07337-b89d-4574-aa0f-f9a3cdebdd48-util\") pod \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\" (UID: \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\") " Feb 19 09:55:50 crc kubenswrapper[4873]: I0219 09:55:50.037998 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14a07337-b89d-4574-aa0f-f9a3cdebdd48-bundle\") pod \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\" (UID: \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\") " Feb 19 09:55:50 crc kubenswrapper[4873]: I0219 09:55:50.038053 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkj5g\" (UniqueName: \"kubernetes.io/projected/14a07337-b89d-4574-aa0f-f9a3cdebdd48-kube-api-access-jkj5g\") pod \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\" (UID: \"14a07337-b89d-4574-aa0f-f9a3cdebdd48\") " Feb 19 09:55:50 crc kubenswrapper[4873]: I0219 09:55:50.039921 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a07337-b89d-4574-aa0f-f9a3cdebdd48-bundle" (OuterVolumeSpecName: "bundle") pod "14a07337-b89d-4574-aa0f-f9a3cdebdd48" (UID: "14a07337-b89d-4574-aa0f-f9a3cdebdd48"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:55:50 crc kubenswrapper[4873]: I0219 09:55:50.046326 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a07337-b89d-4574-aa0f-f9a3cdebdd48-kube-api-access-jkj5g" (OuterVolumeSpecName: "kube-api-access-jkj5g") pod "14a07337-b89d-4574-aa0f-f9a3cdebdd48" (UID: "14a07337-b89d-4574-aa0f-f9a3cdebdd48"). InnerVolumeSpecName "kube-api-access-jkj5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:55:50 crc kubenswrapper[4873]: I0219 09:55:50.059005 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a07337-b89d-4574-aa0f-f9a3cdebdd48-util" (OuterVolumeSpecName: "util") pod "14a07337-b89d-4574-aa0f-f9a3cdebdd48" (UID: "14a07337-b89d-4574-aa0f-f9a3cdebdd48"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:55:50 crc kubenswrapper[4873]: I0219 09:55:50.138773 4873 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14a07337-b89d-4574-aa0f-f9a3cdebdd48-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:55:50 crc kubenswrapper[4873]: I0219 09:55:50.138811 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkj5g\" (UniqueName: \"kubernetes.io/projected/14a07337-b89d-4574-aa0f-f9a3cdebdd48-kube-api-access-jkj5g\") on node \"crc\" DevicePath \"\"" Feb 19 09:55:50 crc kubenswrapper[4873]: I0219 09:55:50.138823 4873 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14a07337-b89d-4574-aa0f-f9a3cdebdd48-util\") on node \"crc\" DevicePath \"\"" Feb 19 09:55:50 crc kubenswrapper[4873]: I0219 09:55:50.688419 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" event={"ID":"14a07337-b89d-4574-aa0f-f9a3cdebdd48","Type":"ContainerDied","Data":"50044929f3c9b7b6a3f1c5020a712e49f731400273f63f27992d4494a8336eb0"} Feb 19 09:55:50 crc kubenswrapper[4873]: I0219 09:55:50.688696 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50044929f3c9b7b6a3f1c5020a712e49f731400273f63f27992d4494a8336eb0" Feb 19 09:55:50 crc kubenswrapper[4873]: I0219 09:55:50.688533 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv" Feb 19 09:55:52 crc kubenswrapper[4873]: I0219 09:55:52.451526 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-qlgxw"] Feb 19 09:55:52 crc kubenswrapper[4873]: E0219 09:55:52.451912 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a07337-b89d-4574-aa0f-f9a3cdebdd48" containerName="pull" Feb 19 09:55:52 crc kubenswrapper[4873]: I0219 09:55:52.451940 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a07337-b89d-4574-aa0f-f9a3cdebdd48" containerName="pull" Feb 19 09:55:52 crc kubenswrapper[4873]: E0219 09:55:52.451969 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a07337-b89d-4574-aa0f-f9a3cdebdd48" containerName="util" Feb 19 09:55:52 crc kubenswrapper[4873]: I0219 09:55:52.451988 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a07337-b89d-4574-aa0f-f9a3cdebdd48" containerName="util" Feb 19 09:55:52 crc kubenswrapper[4873]: E0219 09:55:52.452012 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a07337-b89d-4574-aa0f-f9a3cdebdd48" containerName="extract" Feb 19 09:55:52 crc kubenswrapper[4873]: I0219 09:55:52.452026 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a07337-b89d-4574-aa0f-f9a3cdebdd48" containerName="extract" Feb 19 09:55:52 crc kubenswrapper[4873]: I0219 09:55:52.452272 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a07337-b89d-4574-aa0f-f9a3cdebdd48" containerName="extract" Feb 19 09:55:52 crc kubenswrapper[4873]: I0219 09:55:52.452897 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-qlgxw" Feb 19 09:55:52 crc kubenswrapper[4873]: I0219 09:55:52.455016 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-snb5x" Feb 19 09:55:52 crc kubenswrapper[4873]: I0219 09:55:52.455657 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 19 09:55:52 crc kubenswrapper[4873]: I0219 09:55:52.457182 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 19 09:55:52 crc kubenswrapper[4873]: I0219 09:55:52.475640 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-qlgxw"] Feb 19 09:55:52 crc kubenswrapper[4873]: I0219 09:55:52.568723 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vckxs\" (UniqueName: \"kubernetes.io/projected/f7f28c8a-4571-485c-96a2-fc1c5856e3ea-kube-api-access-vckxs\") pod \"nmstate-operator-694c9596b7-qlgxw\" (UID: \"f7f28c8a-4571-485c-96a2-fc1c5856e3ea\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-qlgxw" Feb 19 09:55:52 crc kubenswrapper[4873]: I0219 09:55:52.670530 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vckxs\" (UniqueName: \"kubernetes.io/projected/f7f28c8a-4571-485c-96a2-fc1c5856e3ea-kube-api-access-vckxs\") pod \"nmstate-operator-694c9596b7-qlgxw\" (UID: \"f7f28c8a-4571-485c-96a2-fc1c5856e3ea\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-qlgxw" Feb 19 09:55:52 crc kubenswrapper[4873]: I0219 09:55:52.703076 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vckxs\" (UniqueName: \"kubernetes.io/projected/f7f28c8a-4571-485c-96a2-fc1c5856e3ea-kube-api-access-vckxs\") pod \"nmstate-operator-694c9596b7-qlgxw\" (UID: 
\"f7f28c8a-4571-485c-96a2-fc1c5856e3ea\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-qlgxw" Feb 19 09:55:52 crc kubenswrapper[4873]: I0219 09:55:52.784839 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-qlgxw" Feb 19 09:55:53 crc kubenswrapper[4873]: I0219 09:55:53.136763 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-qlgxw"] Feb 19 09:55:53 crc kubenswrapper[4873]: W0219 09:55:53.150317 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7f28c8a_4571_485c_96a2_fc1c5856e3ea.slice/crio-3a58557f6a8022c3fd87a066a9c365bc8597898a03ce230f60be2b3d0e44ccf2 WatchSource:0}: Error finding container 3a58557f6a8022c3fd87a066a9c365bc8597898a03ce230f60be2b3d0e44ccf2: Status 404 returned error can't find the container with id 3a58557f6a8022c3fd87a066a9c365bc8597898a03ce230f60be2b3d0e44ccf2 Feb 19 09:55:53 crc kubenswrapper[4873]: I0219 09:55:53.712505 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-qlgxw" event={"ID":"f7f28c8a-4571-485c-96a2-fc1c5856e3ea","Type":"ContainerStarted","Data":"3a58557f6a8022c3fd87a066a9c365bc8597898a03ce230f60be2b3d0e44ccf2"} Feb 19 09:55:55 crc kubenswrapper[4873]: I0219 09:55:55.726489 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-qlgxw" event={"ID":"f7f28c8a-4571-485c-96a2-fc1c5856e3ea","Type":"ContainerStarted","Data":"bcd3218e43adeea5d5a965457865b083ce1cb88c78f8236fbba93be41cbb2f5b"} Feb 19 09:55:55 crc kubenswrapper[4873]: I0219 09:55:55.752987 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-qlgxw" podStartSLOduration=1.7699926559999999 podStartE2EDuration="3.752962488s" podCreationTimestamp="2026-02-19 09:55:52 +0000 UTC" 
firstStartedPulling="2026-02-19 09:55:53.153850684 +0000 UTC m=+662.443282322" lastFinishedPulling="2026-02-19 09:55:55.136820516 +0000 UTC m=+664.426252154" observedRunningTime="2026-02-19 09:55:55.750462126 +0000 UTC m=+665.039893774" watchObservedRunningTime="2026-02-19 09:55:55.752962488 +0000 UTC m=+665.042394176" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.644498 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-8jgss"] Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.645358 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8jgss" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.647134 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-hzwdm" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.661740 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-8jgss"] Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.670176 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w"] Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.671467 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.672873 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.678418 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-75txf"] Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.679586 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.705182 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w"] Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.727794 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/62408ce4-73ce-4726-91c1-96f645c39dee-dbus-socket\") pod \"nmstate-handler-75txf\" (UID: \"62408ce4-73ce-4726-91c1-96f645c39dee\") " pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.727851 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf8wr\" (UniqueName: \"kubernetes.io/projected/7af074a2-c1f7-4253-8efc-065748e0452b-kube-api-access-hf8wr\") pod \"nmstate-webhook-866bcb46dc-nfh8w\" (UID: \"7af074a2-c1f7-4253-8efc-065748e0452b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.727882 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/62408ce4-73ce-4726-91c1-96f645c39dee-nmstate-lock\") pod \"nmstate-handler-75txf\" (UID: \"62408ce4-73ce-4726-91c1-96f645c39dee\") " pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.727911 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7af074a2-c1f7-4253-8efc-065748e0452b-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-nfh8w\" (UID: \"7af074a2-c1f7-4253-8efc-065748e0452b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.727995 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7dlv\" (UniqueName: \"kubernetes.io/projected/62408ce4-73ce-4726-91c1-96f645c39dee-kube-api-access-q7dlv\") pod \"nmstate-handler-75txf\" (UID: \"62408ce4-73ce-4726-91c1-96f645c39dee\") " pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.728031 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m9bz\" (UniqueName: \"kubernetes.io/projected/3b960434-ef37-45ae-aa50-8d719c8e2df5-kube-api-access-4m9bz\") pod \"nmstate-metrics-58c85c668d-8jgss\" (UID: \"3b960434-ef37-45ae-aa50-8d719c8e2df5\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-8jgss" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.728062 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/62408ce4-73ce-4726-91c1-96f645c39dee-ovs-socket\") pod \"nmstate-handler-75txf\" (UID: \"62408ce4-73ce-4726-91c1-96f645c39dee\") " pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.792454 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m"] Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.800187 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.810550 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rjk6l" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.810753 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.810868 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.817643 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m"] Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.828758 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m9bz\" (UniqueName: \"kubernetes.io/projected/3b960434-ef37-45ae-aa50-8d719c8e2df5-kube-api-access-4m9bz\") pod \"nmstate-metrics-58c85c668d-8jgss\" (UID: \"3b960434-ef37-45ae-aa50-8d719c8e2df5\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-8jgss" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.828813 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9b3c6348-1c17-4774-9739-7a1dd3021d81-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-9cr2m\" (UID: \"9b3c6348-1c17-4774-9739-7a1dd3021d81\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.828838 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/62408ce4-73ce-4726-91c1-96f645c39dee-ovs-socket\") pod \"nmstate-handler-75txf\" (UID: \"62408ce4-73ce-4726-91c1-96f645c39dee\") " 
pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.828877 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/62408ce4-73ce-4726-91c1-96f645c39dee-dbus-socket\") pod \"nmstate-handler-75txf\" (UID: \"62408ce4-73ce-4726-91c1-96f645c39dee\") " pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.828896 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf8wr\" (UniqueName: \"kubernetes.io/projected/7af074a2-c1f7-4253-8efc-065748e0452b-kube-api-access-hf8wr\") pod \"nmstate-webhook-866bcb46dc-nfh8w\" (UID: \"7af074a2-c1f7-4253-8efc-065748e0452b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.828912 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/62408ce4-73ce-4726-91c1-96f645c39dee-nmstate-lock\") pod \"nmstate-handler-75txf\" (UID: \"62408ce4-73ce-4726-91c1-96f645c39dee\") " pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.828928 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7af074a2-c1f7-4253-8efc-065748e0452b-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-nfh8w\" (UID: \"7af074a2-c1f7-4253-8efc-065748e0452b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.828956 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vzd9\" (UniqueName: \"kubernetes.io/projected/9b3c6348-1c17-4774-9739-7a1dd3021d81-kube-api-access-8vzd9\") pod \"nmstate-console-plugin-5c78fc5d65-9cr2m\" (UID: 
\"9b3c6348-1c17-4774-9739-7a1dd3021d81\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.828979 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b3c6348-1c17-4774-9739-7a1dd3021d81-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-9cr2m\" (UID: \"9b3c6348-1c17-4774-9739-7a1dd3021d81\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.829004 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7dlv\" (UniqueName: \"kubernetes.io/projected/62408ce4-73ce-4726-91c1-96f645c39dee-kube-api-access-q7dlv\") pod \"nmstate-handler-75txf\" (UID: \"62408ce4-73ce-4726-91c1-96f645c39dee\") " pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.830520 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/62408ce4-73ce-4726-91c1-96f645c39dee-ovs-socket\") pod \"nmstate-handler-75txf\" (UID: \"62408ce4-73ce-4726-91c1-96f645c39dee\") " pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.830978 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/62408ce4-73ce-4726-91c1-96f645c39dee-nmstate-lock\") pod \"nmstate-handler-75txf\" (UID: \"62408ce4-73ce-4726-91c1-96f645c39dee\") " pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.831372 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/62408ce4-73ce-4726-91c1-96f645c39dee-dbus-socket\") pod \"nmstate-handler-75txf\" (UID: 
\"62408ce4-73ce-4726-91c1-96f645c39dee\") " pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.845811 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7af074a2-c1f7-4253-8efc-065748e0452b-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-nfh8w\" (UID: \"7af074a2-c1f7-4253-8efc-065748e0452b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.846653 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m9bz\" (UniqueName: \"kubernetes.io/projected/3b960434-ef37-45ae-aa50-8d719c8e2df5-kube-api-access-4m9bz\") pod \"nmstate-metrics-58c85c668d-8jgss\" (UID: \"3b960434-ef37-45ae-aa50-8d719c8e2df5\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-8jgss" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.847160 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf8wr\" (UniqueName: \"kubernetes.io/projected/7af074a2-c1f7-4253-8efc-065748e0452b-kube-api-access-hf8wr\") pod \"nmstate-webhook-866bcb46dc-nfh8w\" (UID: \"7af074a2-c1f7-4253-8efc-065748e0452b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.850611 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7dlv\" (UniqueName: \"kubernetes.io/projected/62408ce4-73ce-4726-91c1-96f645c39dee-kube-api-access-q7dlv\") pod \"nmstate-handler-75txf\" (UID: \"62408ce4-73ce-4726-91c1-96f645c39dee\") " pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.930517 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9b3c6348-1c17-4774-9739-7a1dd3021d81-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-9cr2m\" (UID: 
\"9b3c6348-1c17-4774-9739-7a1dd3021d81\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.930613 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vzd9\" (UniqueName: \"kubernetes.io/projected/9b3c6348-1c17-4774-9739-7a1dd3021d81-kube-api-access-8vzd9\") pod \"nmstate-console-plugin-5c78fc5d65-9cr2m\" (UID: \"9b3c6348-1c17-4774-9739-7a1dd3021d81\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.930645 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b3c6348-1c17-4774-9739-7a1dd3021d81-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-9cr2m\" (UID: \"9b3c6348-1c17-4774-9739-7a1dd3021d81\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" Feb 19 09:55:56 crc kubenswrapper[4873]: E0219 09:55:56.930779 4873 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 19 09:55:56 crc kubenswrapper[4873]: E0219 09:55:56.930836 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b3c6348-1c17-4774-9739-7a1dd3021d81-plugin-serving-cert podName:9b3c6348-1c17-4774-9739-7a1dd3021d81 nodeName:}" failed. No retries permitted until 2026-02-19 09:55:57.430816823 +0000 UTC m=+666.720248461 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/9b3c6348-1c17-4774-9739-7a1dd3021d81-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-9cr2m" (UID: "9b3c6348-1c17-4774-9739-7a1dd3021d81") : secret "plugin-serving-cert" not found Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.931919 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9b3c6348-1c17-4774-9739-7a1dd3021d81-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-9cr2m\" (UID: \"9b3c6348-1c17-4774-9739-7a1dd3021d81\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.952411 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vzd9\" (UniqueName: \"kubernetes.io/projected/9b3c6348-1c17-4774-9739-7a1dd3021d81-kube-api-access-8vzd9\") pod \"nmstate-console-plugin-5c78fc5d65-9cr2m\" (UID: \"9b3c6348-1c17-4774-9739-7a1dd3021d81\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.965371 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8jgss" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.972358 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-9fd68db6b-q4dk6"] Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.973182 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:56 crc kubenswrapper[4873]: I0219 09:55:56.985512 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.003331 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.031354 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e296ec6d-4270-44f2-a73e-16ca3ac286f2-console-oauth-config\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.031403 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e296ec6d-4270-44f2-a73e-16ca3ac286f2-console-config\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.031421 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e296ec6d-4270-44f2-a73e-16ca3ac286f2-trusted-ca-bundle\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.031438 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e296ec6d-4270-44f2-a73e-16ca3ac286f2-console-serving-cert\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.031459 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz95l\" (UniqueName: 
\"kubernetes.io/projected/e296ec6d-4270-44f2-a73e-16ca3ac286f2-kube-api-access-sz95l\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.031487 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e296ec6d-4270-44f2-a73e-16ca3ac286f2-service-ca\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.031537 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e296ec6d-4270-44f2-a73e-16ca3ac286f2-oauth-serving-cert\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.046715 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9fd68db6b-q4dk6"] Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.132842 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e296ec6d-4270-44f2-a73e-16ca3ac286f2-oauth-serving-cert\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.133196 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e296ec6d-4270-44f2-a73e-16ca3ac286f2-console-oauth-config\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 
09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.133225 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e296ec6d-4270-44f2-a73e-16ca3ac286f2-console-config\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.133240 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e296ec6d-4270-44f2-a73e-16ca3ac286f2-trusted-ca-bundle\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.133254 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e296ec6d-4270-44f2-a73e-16ca3ac286f2-console-serving-cert\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.133276 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz95l\" (UniqueName: \"kubernetes.io/projected/e296ec6d-4270-44f2-a73e-16ca3ac286f2-kube-api-access-sz95l\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.133304 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e296ec6d-4270-44f2-a73e-16ca3ac286f2-service-ca\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 
09:55:57.134010 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e296ec6d-4270-44f2-a73e-16ca3ac286f2-oauth-serving-cert\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.134210 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e296ec6d-4270-44f2-a73e-16ca3ac286f2-console-config\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.134733 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e296ec6d-4270-44f2-a73e-16ca3ac286f2-trusted-ca-bundle\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.135246 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e296ec6d-4270-44f2-a73e-16ca3ac286f2-service-ca\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.136543 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e296ec6d-4270-44f2-a73e-16ca3ac286f2-console-oauth-config\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.136663 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e296ec6d-4270-44f2-a73e-16ca3ac286f2-console-serving-cert\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.149067 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz95l\" (UniqueName: \"kubernetes.io/projected/e296ec6d-4270-44f2-a73e-16ca3ac286f2-kube-api-access-sz95l\") pod \"console-9fd68db6b-q4dk6\" (UID: \"e296ec6d-4270-44f2-a73e-16ca3ac286f2\") " pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.216738 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-8jgss"] Feb 19 09:55:57 crc kubenswrapper[4873]: W0219 09:55:57.220486 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b960434_ef37_45ae_aa50_8d719c8e2df5.slice/crio-4d2f6fab116e5c3697d562887c74f6d00a719f7bdeb4748b1a1db81da93f7e89 WatchSource:0}: Error finding container 4d2f6fab116e5c3697d562887c74f6d00a719f7bdeb4748b1a1db81da93f7e89: Status 404 returned error can't find the container with id 4d2f6fab116e5c3697d562887c74f6d00a719f7bdeb4748b1a1db81da93f7e89 Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.331134 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.441577 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b3c6348-1c17-4774-9739-7a1dd3021d81-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-9cr2m\" (UID: \"9b3c6348-1c17-4774-9739-7a1dd3021d81\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.447892 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b3c6348-1c17-4774-9739-7a1dd3021d81-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-9cr2m\" (UID: \"9b3c6348-1c17-4774-9739-7a1dd3021d81\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.506603 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w"] Feb 19 09:55:57 crc kubenswrapper[4873]: W0219 09:55:57.512992 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7af074a2_c1f7_4253_8efc_065748e0452b.slice/crio-92b1864a7aa4bd5eace02fc56e5084af9d69850210c3ebde41dc98bddbc366d5 WatchSource:0}: Error finding container 92b1864a7aa4bd5eace02fc56e5084af9d69850210c3ebde41dc98bddbc366d5: Status 404 returned error can't find the container with id 92b1864a7aa4bd5eace02fc56e5084af9d69850210c3ebde41dc98bddbc366d5 Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.610995 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9fd68db6b-q4dk6"] Feb 19 09:55:57 crc kubenswrapper[4873]: W0219 09:55:57.618506 4873 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode296ec6d_4270_44f2_a73e_16ca3ac286f2.slice/crio-658e1ca75f45b4f5981d224251d0cdfc634819e233906e2e2af271643db59640 WatchSource:0}: Error finding container 658e1ca75f45b4f5981d224251d0cdfc634819e233906e2e2af271643db59640: Status 404 returned error can't find the container with id 658e1ca75f45b4f5981d224251d0cdfc634819e233906e2e2af271643db59640 Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.730558 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.742341 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-75txf" event={"ID":"62408ce4-73ce-4726-91c1-96f645c39dee","Type":"ContainerStarted","Data":"e50ab46848007b49de33727979617347fac41a1bf3840c89b1336dac8785c0a3"} Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.745906 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9fd68db6b-q4dk6" event={"ID":"e296ec6d-4270-44f2-a73e-16ca3ac286f2","Type":"ContainerStarted","Data":"db2dcca52446f1249df65049e1489e4d5b3d262620fd3e3a2c380dc5f4e3673d"} Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.745970 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9fd68db6b-q4dk6" event={"ID":"e296ec6d-4270-44f2-a73e-16ca3ac286f2","Type":"ContainerStarted","Data":"658e1ca75f45b4f5981d224251d0cdfc634819e233906e2e2af271643db59640"} Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.749244 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8jgss" event={"ID":"3b960434-ef37-45ae-aa50-8d719c8e2df5","Type":"ContainerStarted","Data":"4d2f6fab116e5c3697d562887c74f6d00a719f7bdeb4748b1a1db81da93f7e89"} Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.751579 4873 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w" event={"ID":"7af074a2-c1f7-4253-8efc-065748e0452b","Type":"ContainerStarted","Data":"92b1864a7aa4bd5eace02fc56e5084af9d69850210c3ebde41dc98bddbc366d5"} Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.770874 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9fd68db6b-q4dk6" podStartSLOduration=1.770853839 podStartE2EDuration="1.770853839s" podCreationTimestamp="2026-02-19 09:55:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:55:57.770724686 +0000 UTC m=+667.060156374" watchObservedRunningTime="2026-02-19 09:55:57.770853839 +0000 UTC m=+667.060285487" Feb 19 09:55:57 crc kubenswrapper[4873]: I0219 09:55:57.964599 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m"] Feb 19 09:55:58 crc kubenswrapper[4873]: I0219 09:55:58.761082 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" event={"ID":"9b3c6348-1c17-4774-9739-7a1dd3021d81","Type":"ContainerStarted","Data":"e8543bb448a5bdde93b55c52d6b51546a92b1f5767f62d241a0fafccab300441"} Feb 19 09:56:00 crc kubenswrapper[4873]: I0219 09:56:00.779239 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8jgss" event={"ID":"3b960434-ef37-45ae-aa50-8d719c8e2df5","Type":"ContainerStarted","Data":"bdd6d3fc8ac3b1932764fb8824aba4c7a0cbe2d3ddd8e28196a8914bc6bc0fba"} Feb 19 09:56:00 crc kubenswrapper[4873]: I0219 09:56:00.781748 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" event={"ID":"9b3c6348-1c17-4774-9739-7a1dd3021d81","Type":"ContainerStarted","Data":"e8073e35b5c3f9e49c7ebc00a1c8cb948e7832a45a73476356b23d9d38e5df4b"} Feb 19 09:56:00 crc 
kubenswrapper[4873]: I0219 09:56:00.786157 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w" event={"ID":"7af074a2-c1f7-4253-8efc-065748e0452b","Type":"ContainerStarted","Data":"62560839dff353f19ffcb6aa901023703c7e7659d45c01502ea24939023b23df"} Feb 19 09:56:00 crc kubenswrapper[4873]: I0219 09:56:00.786395 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w" Feb 19 09:56:00 crc kubenswrapper[4873]: I0219 09:56:00.795038 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-75txf" event={"ID":"62408ce4-73ce-4726-91c1-96f645c39dee","Type":"ContainerStarted","Data":"c6dc947c1ab150d4460da21d597779792f8da677a79252f400414d17b9b36034"} Feb 19 09:56:00 crc kubenswrapper[4873]: I0219 09:56:00.795516 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:56:00 crc kubenswrapper[4873]: I0219 09:56:00.814506 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-9cr2m" podStartSLOduration=2.814212096 podStartE2EDuration="4.814472205s" podCreationTimestamp="2026-02-19 09:55:56 +0000 UTC" firstStartedPulling="2026-02-19 09:55:57.980075244 +0000 UTC m=+667.269506902" lastFinishedPulling="2026-02-19 09:55:59.980335373 +0000 UTC m=+669.269767011" observedRunningTime="2026-02-19 09:56:00.800671316 +0000 UTC m=+670.090103004" watchObservedRunningTime="2026-02-19 09:56:00.814472205 +0000 UTC m=+670.103903903" Feb 19 09:56:00 crc kubenswrapper[4873]: I0219 09:56:00.827677 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w" podStartSLOduration=2.361713289 podStartE2EDuration="4.827650519s" podCreationTimestamp="2026-02-19 09:55:56 +0000 UTC" firstStartedPulling="2026-02-19 09:55:57.516219007 
+0000 UTC m=+666.805650645" lastFinishedPulling="2026-02-19 09:55:59.982156237 +0000 UTC m=+669.271587875" observedRunningTime="2026-02-19 09:56:00.823909667 +0000 UTC m=+670.113341345" watchObservedRunningTime="2026-02-19 09:56:00.827650519 +0000 UTC m=+670.117082197" Feb 19 09:56:00 crc kubenswrapper[4873]: I0219 09:56:00.854194 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-75txf" podStartSLOduration=1.890434291 podStartE2EDuration="4.854170651s" podCreationTimestamp="2026-02-19 09:55:56 +0000 UTC" firstStartedPulling="2026-02-19 09:55:57.034545844 +0000 UTC m=+666.323977482" lastFinishedPulling="2026-02-19 09:55:59.998282164 +0000 UTC m=+669.287713842" observedRunningTime="2026-02-19 09:56:00.848778779 +0000 UTC m=+670.138210457" watchObservedRunningTime="2026-02-19 09:56:00.854170651 +0000 UTC m=+670.143602329" Feb 19 09:56:02 crc kubenswrapper[4873]: I0219 09:56:02.807267 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8jgss" event={"ID":"3b960434-ef37-45ae-aa50-8d719c8e2df5","Type":"ContainerStarted","Data":"dcd03124864bf577101a5845791573a9ded2ec7e06d8d104785cc28702a462d0"} Feb 19 09:56:02 crc kubenswrapper[4873]: I0219 09:56:02.830828 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-8jgss" podStartSLOduration=1.572252687 podStartE2EDuration="6.830807858s" podCreationTimestamp="2026-02-19 09:55:56 +0000 UTC" firstStartedPulling="2026-02-19 09:55:57.224052304 +0000 UTC m=+666.513483952" lastFinishedPulling="2026-02-19 09:56:02.482607485 +0000 UTC m=+671.772039123" observedRunningTime="2026-02-19 09:56:02.826804919 +0000 UTC m=+672.116236567" watchObservedRunningTime="2026-02-19 09:56:02.830807858 +0000 UTC m=+672.120239506" Feb 19 09:56:07 crc kubenswrapper[4873]: I0219 09:56:07.034978 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-nmstate/nmstate-handler-75txf" Feb 19 09:56:07 crc kubenswrapper[4873]: I0219 09:56:07.332043 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:56:07 crc kubenswrapper[4873]: I0219 09:56:07.332145 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:56:07 crc kubenswrapper[4873]: I0219 09:56:07.338695 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:56:07 crc kubenswrapper[4873]: I0219 09:56:07.845794 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9fd68db6b-q4dk6" Feb 19 09:56:07 crc kubenswrapper[4873]: I0219 09:56:07.901997 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-shnwj"] Feb 19 09:56:16 crc kubenswrapper[4873]: I0219 09:56:16.992310 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nfh8w" Feb 19 09:56:32 crc kubenswrapper[4873]: I0219 09:56:32.961849 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-shnwj" podUID="10aa25f4-7549-468a-b42f-19305ad066dd" containerName="console" containerID="cri-o://cc3b6e572218ab345f1c66c2372f7ae648deb477f5bc61959e0bd3585166d381" gracePeriod=15 Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.395979 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-shnwj_10aa25f4-7549-468a-b42f-19305ad066dd/console/0.log" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.396288 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.538201 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/10aa25f4-7549-468a-b42f-19305ad066dd-console-serving-cert\") pod \"10aa25f4-7549-468a-b42f-19305ad066dd\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.538250 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/10aa25f4-7549-468a-b42f-19305ad066dd-console-oauth-config\") pod \"10aa25f4-7549-468a-b42f-19305ad066dd\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.538301 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ggch\" (UniqueName: \"kubernetes.io/projected/10aa25f4-7549-468a-b42f-19305ad066dd-kube-api-access-9ggch\") pod \"10aa25f4-7549-468a-b42f-19305ad066dd\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.538330 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-console-config\") pod \"10aa25f4-7549-468a-b42f-19305ad066dd\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.538420 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-service-ca\") pod \"10aa25f4-7549-468a-b42f-19305ad066dd\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.538448 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-oauth-serving-cert\") pod \"10aa25f4-7549-468a-b42f-19305ad066dd\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.538468 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-trusted-ca-bundle\") pod \"10aa25f4-7549-468a-b42f-19305ad066dd\" (UID: \"10aa25f4-7549-468a-b42f-19305ad066dd\") " Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.539137 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "10aa25f4-7549-468a-b42f-19305ad066dd" (UID: "10aa25f4-7549-468a-b42f-19305ad066dd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.539154 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-console-config" (OuterVolumeSpecName: "console-config") pod "10aa25f4-7549-468a-b42f-19305ad066dd" (UID: "10aa25f4-7549-468a-b42f-19305ad066dd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.539170 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "10aa25f4-7549-468a-b42f-19305ad066dd" (UID: "10aa25f4-7549-468a-b42f-19305ad066dd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.539197 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-service-ca" (OuterVolumeSpecName: "service-ca") pod "10aa25f4-7549-468a-b42f-19305ad066dd" (UID: "10aa25f4-7549-468a-b42f-19305ad066dd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.539606 4873 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.539632 4873 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.539643 4873 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.539655 4873 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/10aa25f4-7549-468a-b42f-19305ad066dd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.545577 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10aa25f4-7549-468a-b42f-19305ad066dd-kube-api-access-9ggch" (OuterVolumeSpecName: "kube-api-access-9ggch") pod "10aa25f4-7549-468a-b42f-19305ad066dd" (UID: "10aa25f4-7549-468a-b42f-19305ad066dd"). InnerVolumeSpecName "kube-api-access-9ggch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.546264 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10aa25f4-7549-468a-b42f-19305ad066dd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "10aa25f4-7549-468a-b42f-19305ad066dd" (UID: "10aa25f4-7549-468a-b42f-19305ad066dd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.548671 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10aa25f4-7549-468a-b42f-19305ad066dd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "10aa25f4-7549-468a-b42f-19305ad066dd" (UID: "10aa25f4-7549-468a-b42f-19305ad066dd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.619141 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf"] Feb 19 09:56:33 crc kubenswrapper[4873]: E0219 09:56:33.619389 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10aa25f4-7549-468a-b42f-19305ad066dd" containerName="console" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.619412 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="10aa25f4-7549-468a-b42f-19305ad066dd" containerName="console" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.619551 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="10aa25f4-7549-468a-b42f-19305ad066dd" containerName="console" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.620462 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.623089 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.627027 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf"] Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.644642 4873 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/10aa25f4-7549-468a-b42f-19305ad066dd-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.644687 4873 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/10aa25f4-7549-468a-b42f-19305ad066dd-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.644705 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ggch\" (UniqueName: \"kubernetes.io/projected/10aa25f4-7549-468a-b42f-19305ad066dd-kube-api-access-9ggch\") on node \"crc\" DevicePath \"\"" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.745635 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a09955d-14f6-4877-bcb4-701d57165495-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf\" (UID: \"7a09955d-14f6-4877-bcb4-701d57165495\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.745701 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dm7p5\" (UniqueName: \"kubernetes.io/projected/7a09955d-14f6-4877-bcb4-701d57165495-kube-api-access-dm7p5\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf\" (UID: \"7a09955d-14f6-4877-bcb4-701d57165495\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.745730 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a09955d-14f6-4877-bcb4-701d57165495-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf\" (UID: \"7a09955d-14f6-4877-bcb4-701d57165495\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.846975 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a09955d-14f6-4877-bcb4-701d57165495-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf\" (UID: \"7a09955d-14f6-4877-bcb4-701d57165495\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.847041 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm7p5\" (UniqueName: \"kubernetes.io/projected/7a09955d-14f6-4877-bcb4-701d57165495-kube-api-access-dm7p5\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf\" (UID: \"7a09955d-14f6-4877-bcb4-701d57165495\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.847063 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a09955d-14f6-4877-bcb4-701d57165495-util\") pod 
\"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf\" (UID: \"7a09955d-14f6-4877-bcb4-701d57165495\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.847526 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a09955d-14f6-4877-bcb4-701d57165495-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf\" (UID: \"7a09955d-14f6-4877-bcb4-701d57165495\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.847818 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a09955d-14f6-4877-bcb4-701d57165495-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf\" (UID: \"7a09955d-14f6-4877-bcb4-701d57165495\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.869543 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm7p5\" (UniqueName: \"kubernetes.io/projected/7a09955d-14f6-4877-bcb4-701d57165495-kube-api-access-dm7p5\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf\" (UID: \"7a09955d-14f6-4877-bcb4-701d57165495\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" Feb 19 09:56:33 crc kubenswrapper[4873]: I0219 09:56:33.937917 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" Feb 19 09:56:34 crc kubenswrapper[4873]: I0219 09:56:34.033686 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-shnwj_10aa25f4-7549-468a-b42f-19305ad066dd/console/0.log" Feb 19 09:56:34 crc kubenswrapper[4873]: I0219 09:56:34.033756 4873 generic.go:334] "Generic (PLEG): container finished" podID="10aa25f4-7549-468a-b42f-19305ad066dd" containerID="cc3b6e572218ab345f1c66c2372f7ae648deb477f5bc61959e0bd3585166d381" exitCode=2 Feb 19 09:56:34 crc kubenswrapper[4873]: I0219 09:56:34.033814 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-shnwj" event={"ID":"10aa25f4-7549-468a-b42f-19305ad066dd","Type":"ContainerDied","Data":"cc3b6e572218ab345f1c66c2372f7ae648deb477f5bc61959e0bd3585166d381"} Feb 19 09:56:34 crc kubenswrapper[4873]: I0219 09:56:34.033877 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-shnwj" event={"ID":"10aa25f4-7549-468a-b42f-19305ad066dd","Type":"ContainerDied","Data":"7a581424f0da8ea44b76eb3be0d323e922f9fdfbe4bef5b6c66bc43929d92666"} Feb 19 09:56:34 crc kubenswrapper[4873]: I0219 09:56:34.033888 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-shnwj" Feb 19 09:56:34 crc kubenswrapper[4873]: I0219 09:56:34.033906 4873 scope.go:117] "RemoveContainer" containerID="cc3b6e572218ab345f1c66c2372f7ae648deb477f5bc61959e0bd3585166d381" Feb 19 09:56:34 crc kubenswrapper[4873]: I0219 09:56:34.053909 4873 scope.go:117] "RemoveContainer" containerID="cc3b6e572218ab345f1c66c2372f7ae648deb477f5bc61959e0bd3585166d381" Feb 19 09:56:34 crc kubenswrapper[4873]: E0219 09:56:34.056660 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc3b6e572218ab345f1c66c2372f7ae648deb477f5bc61959e0bd3585166d381\": container with ID starting with cc3b6e572218ab345f1c66c2372f7ae648deb477f5bc61959e0bd3585166d381 not found: ID does not exist" containerID="cc3b6e572218ab345f1c66c2372f7ae648deb477f5bc61959e0bd3585166d381" Feb 19 09:56:34 crc kubenswrapper[4873]: I0219 09:56:34.056721 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc3b6e572218ab345f1c66c2372f7ae648deb477f5bc61959e0bd3585166d381"} err="failed to get container status \"cc3b6e572218ab345f1c66c2372f7ae648deb477f5bc61959e0bd3585166d381\": rpc error: code = NotFound desc = could not find container \"cc3b6e572218ab345f1c66c2372f7ae648deb477f5bc61959e0bd3585166d381\": container with ID starting with cc3b6e572218ab345f1c66c2372f7ae648deb477f5bc61959e0bd3585166d381 not found: ID does not exist" Feb 19 09:56:34 crc kubenswrapper[4873]: I0219 09:56:34.082955 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-shnwj"] Feb 19 09:56:34 crc kubenswrapper[4873]: I0219 09:56:34.086671 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-shnwj"] Feb 19 09:56:34 crc kubenswrapper[4873]: I0219 09:56:34.160484 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf"] Feb 19 09:56:35 crc kubenswrapper[4873]: I0219 09:56:35.041532 4873 generic.go:334] "Generic (PLEG): container finished" podID="7a09955d-14f6-4877-bcb4-701d57165495" containerID="c6bb6bf413ed07d79cee369d269cc426142baaf9a8b0bab4a28130a750e9fcc6" exitCode=0 Feb 19 09:56:35 crc kubenswrapper[4873]: I0219 09:56:35.041830 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" event={"ID":"7a09955d-14f6-4877-bcb4-701d57165495","Type":"ContainerDied","Data":"c6bb6bf413ed07d79cee369d269cc426142baaf9a8b0bab4a28130a750e9fcc6"} Feb 19 09:56:35 crc kubenswrapper[4873]: I0219 09:56:35.041851 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" event={"ID":"7a09955d-14f6-4877-bcb4-701d57165495","Type":"ContainerStarted","Data":"4b33092e42b0328d0fa22229ed7f206f1d4f098b3756047293dd64c3d91c97b7"} Feb 19 09:56:35 crc kubenswrapper[4873]: I0219 09:56:35.493451 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10aa25f4-7549-468a-b42f-19305ad066dd" path="/var/lib/kubelet/pods/10aa25f4-7549-468a-b42f-19305ad066dd/volumes" Feb 19 09:56:37 crc kubenswrapper[4873]: I0219 09:56:37.060085 4873 generic.go:334] "Generic (PLEG): container finished" podID="7a09955d-14f6-4877-bcb4-701d57165495" containerID="0d082f7d81cc31d82999718fe69b040bf30f593a9e99dca26bec10c52b3e11a4" exitCode=0 Feb 19 09:56:37 crc kubenswrapper[4873]: I0219 09:56:37.060167 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" event={"ID":"7a09955d-14f6-4877-bcb4-701d57165495","Type":"ContainerDied","Data":"0d082f7d81cc31d82999718fe69b040bf30f593a9e99dca26bec10c52b3e11a4"} Feb 19 09:56:38 crc kubenswrapper[4873]: I0219 09:56:38.072717 
4873 generic.go:334] "Generic (PLEG): container finished" podID="7a09955d-14f6-4877-bcb4-701d57165495" containerID="d9fc4b1b7f625573fa52768af6d8503a3364a46ce933263c22dfdc17ed349bf8" exitCode=0 Feb 19 09:56:38 crc kubenswrapper[4873]: I0219 09:56:38.072807 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" event={"ID":"7a09955d-14f6-4877-bcb4-701d57165495","Type":"ContainerDied","Data":"d9fc4b1b7f625573fa52768af6d8503a3364a46ce933263c22dfdc17ed349bf8"} Feb 19 09:56:39 crc kubenswrapper[4873]: I0219 09:56:39.352958 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" Feb 19 09:56:39 crc kubenswrapper[4873]: I0219 09:56:39.530844 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm7p5\" (UniqueName: \"kubernetes.io/projected/7a09955d-14f6-4877-bcb4-701d57165495-kube-api-access-dm7p5\") pod \"7a09955d-14f6-4877-bcb4-701d57165495\" (UID: \"7a09955d-14f6-4877-bcb4-701d57165495\") " Feb 19 09:56:39 crc kubenswrapper[4873]: I0219 09:56:39.530897 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a09955d-14f6-4877-bcb4-701d57165495-util\") pod \"7a09955d-14f6-4877-bcb4-701d57165495\" (UID: \"7a09955d-14f6-4877-bcb4-701d57165495\") " Feb 19 09:56:39 crc kubenswrapper[4873]: I0219 09:56:39.530932 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a09955d-14f6-4877-bcb4-701d57165495-bundle\") pod \"7a09955d-14f6-4877-bcb4-701d57165495\" (UID: \"7a09955d-14f6-4877-bcb4-701d57165495\") " Feb 19 09:56:39 crc kubenswrapper[4873]: I0219 09:56:39.532447 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7a09955d-14f6-4877-bcb4-701d57165495-bundle" (OuterVolumeSpecName: "bundle") pod "7a09955d-14f6-4877-bcb4-701d57165495" (UID: "7a09955d-14f6-4877-bcb4-701d57165495"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:56:39 crc kubenswrapper[4873]: I0219 09:56:39.538870 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a09955d-14f6-4877-bcb4-701d57165495-kube-api-access-dm7p5" (OuterVolumeSpecName: "kube-api-access-dm7p5") pod "7a09955d-14f6-4877-bcb4-701d57165495" (UID: "7a09955d-14f6-4877-bcb4-701d57165495"). InnerVolumeSpecName "kube-api-access-dm7p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:56:39 crc kubenswrapper[4873]: I0219 09:56:39.632257 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm7p5\" (UniqueName: \"kubernetes.io/projected/7a09955d-14f6-4877-bcb4-701d57165495-kube-api-access-dm7p5\") on node \"crc\" DevicePath \"\"" Feb 19 09:56:39 crc kubenswrapper[4873]: I0219 09:56:39.632471 4873 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a09955d-14f6-4877-bcb4-701d57165495-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:56:39 crc kubenswrapper[4873]: I0219 09:56:39.865800 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a09955d-14f6-4877-bcb4-701d57165495-util" (OuterVolumeSpecName: "util") pod "7a09955d-14f6-4877-bcb4-701d57165495" (UID: "7a09955d-14f6-4877-bcb4-701d57165495"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:56:39 crc kubenswrapper[4873]: I0219 09:56:39.935184 4873 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a09955d-14f6-4877-bcb4-701d57165495-util\") on node \"crc\" DevicePath \"\"" Feb 19 09:56:40 crc kubenswrapper[4873]: I0219 09:56:40.087786 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" event={"ID":"7a09955d-14f6-4877-bcb4-701d57165495","Type":"ContainerDied","Data":"4b33092e42b0328d0fa22229ed7f206f1d4f098b3756047293dd64c3d91c97b7"} Feb 19 09:56:40 crc kubenswrapper[4873]: I0219 09:56:40.087997 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b33092e42b0328d0fa22229ed7f206f1d4f098b3756047293dd64c3d91c97b7" Feb 19 09:56:40 crc kubenswrapper[4873]: I0219 09:56:40.087859 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf" Feb 19 09:56:48 crc kubenswrapper[4873]: I0219 09:56:48.240164 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:56:48 crc kubenswrapper[4873]: I0219 09:56:48.240822 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:56:48 crc kubenswrapper[4873]: I0219 09:56:48.992281 4873 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-6897955989-f6tl8"] Feb 19 09:56:48 crc kubenswrapper[4873]: E0219 09:56:48.992568 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a09955d-14f6-4877-bcb4-701d57165495" containerName="pull" Feb 19 09:56:48 crc kubenswrapper[4873]: I0219 09:56:48.992588 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a09955d-14f6-4877-bcb4-701d57165495" containerName="pull" Feb 19 09:56:48 crc kubenswrapper[4873]: E0219 09:56:48.992601 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a09955d-14f6-4877-bcb4-701d57165495" containerName="util" Feb 19 09:56:48 crc kubenswrapper[4873]: I0219 09:56:48.992609 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a09955d-14f6-4877-bcb4-701d57165495" containerName="util" Feb 19 09:56:48 crc kubenswrapper[4873]: E0219 09:56:48.992618 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a09955d-14f6-4877-bcb4-701d57165495" containerName="extract" Feb 19 09:56:48 crc kubenswrapper[4873]: I0219 09:56:48.992626 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a09955d-14f6-4877-bcb4-701d57165495" containerName="extract" Feb 19 09:56:48 crc kubenswrapper[4873]: I0219 09:56:48.992764 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a09955d-14f6-4877-bcb4-701d57165495" containerName="extract" Feb 19 09:56:48 crc kubenswrapper[4873]: I0219 09:56:48.993307 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" Feb 19 09:56:48 crc kubenswrapper[4873]: I0219 09:56:48.996815 4873 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 19 09:56:48 crc kubenswrapper[4873]: I0219 09:56:48.999522 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.000076 4873 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.000157 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.004776 4873 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-jxtk7" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.006202 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6897955989-f6tl8"] Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.047860 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpdql\" (UniqueName: \"kubernetes.io/projected/94f344cf-0f09-4812-ab40-dcce7f260a53-kube-api-access-fpdql\") pod \"metallb-operator-controller-manager-6897955989-f6tl8\" (UID: \"94f344cf-0f09-4812-ab40-dcce7f260a53\") " pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.047968 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94f344cf-0f09-4812-ab40-dcce7f260a53-webhook-cert\") pod 
\"metallb-operator-controller-manager-6897955989-f6tl8\" (UID: \"94f344cf-0f09-4812-ab40-dcce7f260a53\") " pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.048013 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94f344cf-0f09-4812-ab40-dcce7f260a53-apiservice-cert\") pod \"metallb-operator-controller-manager-6897955989-f6tl8\" (UID: \"94f344cf-0f09-4812-ab40-dcce7f260a53\") " pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.149039 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94f344cf-0f09-4812-ab40-dcce7f260a53-apiservice-cert\") pod \"metallb-operator-controller-manager-6897955989-f6tl8\" (UID: \"94f344cf-0f09-4812-ab40-dcce7f260a53\") " pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.149328 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpdql\" (UniqueName: \"kubernetes.io/projected/94f344cf-0f09-4812-ab40-dcce7f260a53-kube-api-access-fpdql\") pod \"metallb-operator-controller-manager-6897955989-f6tl8\" (UID: \"94f344cf-0f09-4812-ab40-dcce7f260a53\") " pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.149458 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94f344cf-0f09-4812-ab40-dcce7f260a53-webhook-cert\") pod \"metallb-operator-controller-manager-6897955989-f6tl8\" (UID: \"94f344cf-0f09-4812-ab40-dcce7f260a53\") " pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" Feb 19 09:56:49 crc 
kubenswrapper[4873]: I0219 09:56:49.155250 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94f344cf-0f09-4812-ab40-dcce7f260a53-apiservice-cert\") pod \"metallb-operator-controller-manager-6897955989-f6tl8\" (UID: \"94f344cf-0f09-4812-ab40-dcce7f260a53\") " pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.160727 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94f344cf-0f09-4812-ab40-dcce7f260a53-webhook-cert\") pod \"metallb-operator-controller-manager-6897955989-f6tl8\" (UID: \"94f344cf-0f09-4812-ab40-dcce7f260a53\") " pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.170859 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpdql\" (UniqueName: \"kubernetes.io/projected/94f344cf-0f09-4812-ab40-dcce7f260a53-kube-api-access-fpdql\") pod \"metallb-operator-controller-manager-6897955989-f6tl8\" (UID: \"94f344cf-0f09-4812-ab40-dcce7f260a53\") " pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.252701 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph"] Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.253399 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.255911 4873 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.255918 4873 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.258229 4873 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-87nwr" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.267116 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph"] Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.316446 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.454408 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e9d29e18-f362-478f-911d-ed979e43aae1-webhook-cert\") pod \"metallb-operator-webhook-server-7bf7457c95-rq2ph\" (UID: \"e9d29e18-f362-478f-911d-ed979e43aae1\") " pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.454806 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e9d29e18-f362-478f-911d-ed979e43aae1-apiservice-cert\") pod \"metallb-operator-webhook-server-7bf7457c95-rq2ph\" (UID: \"e9d29e18-f362-478f-911d-ed979e43aae1\") " pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.454837 
4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x2d4\" (UniqueName: \"kubernetes.io/projected/e9d29e18-f362-478f-911d-ed979e43aae1-kube-api-access-7x2d4\") pod \"metallb-operator-webhook-server-7bf7457c95-rq2ph\" (UID: \"e9d29e18-f362-478f-911d-ed979e43aae1\") " pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.555782 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e9d29e18-f362-478f-911d-ed979e43aae1-webhook-cert\") pod \"metallb-operator-webhook-server-7bf7457c95-rq2ph\" (UID: \"e9d29e18-f362-478f-911d-ed979e43aae1\") " pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.555861 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e9d29e18-f362-478f-911d-ed979e43aae1-apiservice-cert\") pod \"metallb-operator-webhook-server-7bf7457c95-rq2ph\" (UID: \"e9d29e18-f362-478f-911d-ed979e43aae1\") " pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.555893 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x2d4\" (UniqueName: \"kubernetes.io/projected/e9d29e18-f362-478f-911d-ed979e43aae1-kube-api-access-7x2d4\") pod \"metallb-operator-webhook-server-7bf7457c95-rq2ph\" (UID: \"e9d29e18-f362-478f-911d-ed979e43aae1\") " pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.570862 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e9d29e18-f362-478f-911d-ed979e43aae1-webhook-cert\") pod 
\"metallb-operator-webhook-server-7bf7457c95-rq2ph\" (UID: \"e9d29e18-f362-478f-911d-ed979e43aae1\") " pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.570911 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e9d29e18-f362-478f-911d-ed979e43aae1-apiservice-cert\") pod \"metallb-operator-webhook-server-7bf7457c95-rq2ph\" (UID: \"e9d29e18-f362-478f-911d-ed979e43aae1\") " pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.578481 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x2d4\" (UniqueName: \"kubernetes.io/projected/e9d29e18-f362-478f-911d-ed979e43aae1-kube-api-access-7x2d4\") pod \"metallb-operator-webhook-server-7bf7457c95-rq2ph\" (UID: \"e9d29e18-f362-478f-911d-ed979e43aae1\") " pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.760318 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6897955989-f6tl8"] Feb 19 09:56:49 crc kubenswrapper[4873]: I0219 09:56:49.864972 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" Feb 19 09:56:50 crc kubenswrapper[4873]: I0219 09:56:50.082686 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph"] Feb 19 09:56:50 crc kubenswrapper[4873]: W0219 09:56:50.091384 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9d29e18_f362_478f_911d_ed979e43aae1.slice/crio-87879c343f83077b31a0aa3c0154e6ab3a202da933b8ad01c407ed97ec183fc0 WatchSource:0}: Error finding container 87879c343f83077b31a0aa3c0154e6ab3a202da933b8ad01c407ed97ec183fc0: Status 404 returned error can't find the container with id 87879c343f83077b31a0aa3c0154e6ab3a202da933b8ad01c407ed97ec183fc0 Feb 19 09:56:50 crc kubenswrapper[4873]: I0219 09:56:50.208068 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" event={"ID":"e9d29e18-f362-478f-911d-ed979e43aae1","Type":"ContainerStarted","Data":"87879c343f83077b31a0aa3c0154e6ab3a202da933b8ad01c407ed97ec183fc0"} Feb 19 09:56:50 crc kubenswrapper[4873]: I0219 09:56:50.211854 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" event={"ID":"94f344cf-0f09-4812-ab40-dcce7f260a53","Type":"ContainerStarted","Data":"4a9bec928bfcffce56b0f6b8d2e124e7b036c64d825f404f8f14d209c14f6b27"} Feb 19 09:56:57 crc kubenswrapper[4873]: I0219 09:56:57.266502 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" event={"ID":"e9d29e18-f362-478f-911d-ed979e43aae1","Type":"ContainerStarted","Data":"3792233b357b79b987ea2bc0bbf51e4aad7dca62813d47d9c7c935e8601a1025"} Feb 19 09:56:57 crc kubenswrapper[4873]: I0219 09:56:57.267309 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" Feb 19 09:56:57 crc kubenswrapper[4873]: I0219 09:56:57.268425 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" event={"ID":"94f344cf-0f09-4812-ab40-dcce7f260a53","Type":"ContainerStarted","Data":"f55b046ccc9003b9acd2ec56d44041b275f3b82b6a6645715b6885bf0876fa8a"} Feb 19 09:56:57 crc kubenswrapper[4873]: I0219 09:56:57.268578 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" Feb 19 09:56:57 crc kubenswrapper[4873]: I0219 09:56:57.289020 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" podStartSLOduration=1.761949921 podStartE2EDuration="8.2890002s" podCreationTimestamp="2026-02-19 09:56:49 +0000 UTC" firstStartedPulling="2026-02-19 09:56:50.094806531 +0000 UTC m=+719.384238169" lastFinishedPulling="2026-02-19 09:56:56.62185679 +0000 UTC m=+725.911288448" observedRunningTime="2026-02-19 09:56:57.28697051 +0000 UTC m=+726.576402148" watchObservedRunningTime="2026-02-19 09:56:57.2890002 +0000 UTC m=+726.578431838" Feb 19 09:56:57 crc kubenswrapper[4873]: I0219 09:56:57.311227 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" podStartSLOduration=2.482902547 podStartE2EDuration="9.311207424s" podCreationTimestamp="2026-02-19 09:56:48 +0000 UTC" firstStartedPulling="2026-02-19 09:56:49.773745788 +0000 UTC m=+719.063177426" lastFinishedPulling="2026-02-19 09:56:56.602050655 +0000 UTC m=+725.891482303" observedRunningTime="2026-02-19 09:56:57.307942164 +0000 UTC m=+726.597373832" watchObservedRunningTime="2026-02-19 09:56:57.311207424 +0000 UTC m=+726.600639062" Feb 19 09:57:09 crc kubenswrapper[4873]: I0219 09:57:09.887354 4873 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7bf7457c95-rq2ph" Feb 19 09:57:18 crc kubenswrapper[4873]: I0219 09:57:18.240796 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:57:18 crc kubenswrapper[4873]: I0219 09:57:18.241342 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:57:24 crc kubenswrapper[4873]: I0219 09:57:24.354640 4873 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 09:57:29 crc kubenswrapper[4873]: I0219 09:57:29.320080 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6897955989-f6tl8" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.052029 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-w8fjg"] Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.054737 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.057918 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.057957 4873 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.058287 4873 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-gllz7" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.060979 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52"] Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.062332 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.065466 4873 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.070480 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/76ea40c9-c4a3-4a32-82a5-d725a73db80d-frr-conf\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.070587 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/76ea40c9-c4a3-4a32-82a5-d725a73db80d-metrics\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.070639 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/76ea40c9-c4a3-4a32-82a5-d725a73db80d-reloader\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.070694 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/76ea40c9-c4a3-4a32-82a5-d725a73db80d-frr-startup\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.070744 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d8f9aee-601f-4530-876b-83709311196b-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-xwr52\" (UID: \"8d8f9aee-601f-4530-876b-83709311196b\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.070776 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/76ea40c9-c4a3-4a32-82a5-d725a73db80d-frr-sockets\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.070849 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29nj2\" (UniqueName: \"kubernetes.io/projected/76ea40c9-c4a3-4a32-82a5-d725a73db80d-kube-api-access-29nj2\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.070899 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76ea40c9-c4a3-4a32-82a5-d725a73db80d-metrics-certs\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.070939 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9lg6\" (UniqueName: \"kubernetes.io/projected/8d8f9aee-601f-4530-876b-83709311196b-kube-api-access-h9lg6\") pod \"frr-k8s-webhook-server-78b44bf5bb-xwr52\" (UID: \"8d8f9aee-601f-4530-876b-83709311196b\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.084842 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52"] Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.148746 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-phsr6"] Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.149687 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-phsr6" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.152261 4873 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-96f8q" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.152718 4873 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.152798 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.152932 4873 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.164062 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-7t964"] Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.165007 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-7t964" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.167037 4873 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.174322 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76ea40c9-c4a3-4a32-82a5-d725a73db80d-metrics-certs\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.174364 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9lg6\" (UniqueName: \"kubernetes.io/projected/8d8f9aee-601f-4530-876b-83709311196b-kube-api-access-h9lg6\") pod \"frr-k8s-webhook-server-78b44bf5bb-xwr52\" (UID: \"8d8f9aee-601f-4530-876b-83709311196b\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.174389 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/76ea40c9-c4a3-4a32-82a5-d725a73db80d-frr-conf\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.174415 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/76ea40c9-c4a3-4a32-82a5-d725a73db80d-metrics\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.174432 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/76ea40c9-c4a3-4a32-82a5-d725a73db80d-reloader\") pod 
\"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.174454 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/76ea40c9-c4a3-4a32-82a5-d725a73db80d-frr-startup\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.174488 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d8f9aee-601f-4530-876b-83709311196b-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-xwr52\" (UID: \"8d8f9aee-601f-4530-876b-83709311196b\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.174503 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/76ea40c9-c4a3-4a32-82a5-d725a73db80d-frr-sockets\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.174537 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29nj2\" (UniqueName: \"kubernetes.io/projected/76ea40c9-c4a3-4a32-82a5-d725a73db80d-kube-api-access-29nj2\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.174793 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-7t964"] Feb 19 09:57:30 crc kubenswrapper[4873]: E0219 09:57:30.174863 4873 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 19 09:57:30 crc kubenswrapper[4873]: E0219 
09:57:30.174900 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76ea40c9-c4a3-4a32-82a5-d725a73db80d-metrics-certs podName:76ea40c9-c4a3-4a32-82a5-d725a73db80d nodeName:}" failed. No retries permitted until 2026-02-19 09:57:30.674887072 +0000 UTC m=+759.964318710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76ea40c9-c4a3-4a32-82a5-d725a73db80d-metrics-certs") pod "frr-k8s-w8fjg" (UID: "76ea40c9-c4a3-4a32-82a5-d725a73db80d") : secret "frr-k8s-certs-secret" not found Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.175745 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/76ea40c9-c4a3-4a32-82a5-d725a73db80d-frr-conf\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.175917 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/76ea40c9-c4a3-4a32-82a5-d725a73db80d-metrics\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.176089 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/76ea40c9-c4a3-4a32-82a5-d725a73db80d-reloader\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.176881 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/76ea40c9-c4a3-4a32-82a5-d725a73db80d-frr-startup\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc 
kubenswrapper[4873]: I0219 09:57:30.180471 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/76ea40c9-c4a3-4a32-82a5-d725a73db80d-frr-sockets\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.192801 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d8f9aee-601f-4530-876b-83709311196b-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-xwr52\" (UID: \"8d8f9aee-601f-4530-876b-83709311196b\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.202787 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9lg6\" (UniqueName: \"kubernetes.io/projected/8d8f9aee-601f-4530-876b-83709311196b-kube-api-access-h9lg6\") pod \"frr-k8s-webhook-server-78b44bf5bb-xwr52\" (UID: \"8d8f9aee-601f-4530-876b-83709311196b\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.204335 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29nj2\" (UniqueName: \"kubernetes.io/projected/76ea40c9-c4a3-4a32-82a5-d725a73db80d-kube-api-access-29nj2\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.276246 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/46cac2a1-6c87-4c4e-a73f-92dbee290015-memberlist\") pod \"speaker-phsr6\" (UID: \"46cac2a1-6c87-4c4e-a73f-92dbee290015\") " pod="metallb-system/speaker-phsr6" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.276545 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdxtd\" (UniqueName: \"kubernetes.io/projected/46cac2a1-6c87-4c4e-a73f-92dbee290015-kube-api-access-gdxtd\") pod \"speaker-phsr6\" (UID: \"46cac2a1-6c87-4c4e-a73f-92dbee290015\") " pod="metallb-system/speaker-phsr6" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.276655 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a42b4a3-c207-40a8-80b9-0532a0ec2865-metrics-certs\") pod \"controller-69bbfbf88f-7t964\" (UID: \"4a42b4a3-c207-40a8-80b9-0532a0ec2865\") " pod="metallb-system/controller-69bbfbf88f-7t964" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.276700 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/46cac2a1-6c87-4c4e-a73f-92dbee290015-metallb-excludel2\") pod \"speaker-phsr6\" (UID: \"46cac2a1-6c87-4c4e-a73f-92dbee290015\") " pod="metallb-system/speaker-phsr6" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.276765 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46cac2a1-6c87-4c4e-a73f-92dbee290015-metrics-certs\") pod \"speaker-phsr6\" (UID: \"46cac2a1-6c87-4c4e-a73f-92dbee290015\") " pod="metallb-system/speaker-phsr6" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.276892 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a42b4a3-c207-40a8-80b9-0532a0ec2865-cert\") pod \"controller-69bbfbf88f-7t964\" (UID: \"4a42b4a3-c207-40a8-80b9-0532a0ec2865\") " pod="metallb-system/controller-69bbfbf88f-7t964" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.276927 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zhr5\" (UniqueName: \"kubernetes.io/projected/4a42b4a3-c207-40a8-80b9-0532a0ec2865-kube-api-access-5zhr5\") pod \"controller-69bbfbf88f-7t964\" (UID: \"4a42b4a3-c207-40a8-80b9-0532a0ec2865\") " pod="metallb-system/controller-69bbfbf88f-7t964" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.378002 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/46cac2a1-6c87-4c4e-a73f-92dbee290015-memberlist\") pod \"speaker-phsr6\" (UID: \"46cac2a1-6c87-4c4e-a73f-92dbee290015\") " pod="metallb-system/speaker-phsr6" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.379018 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdxtd\" (UniqueName: \"kubernetes.io/projected/46cac2a1-6c87-4c4e-a73f-92dbee290015-kube-api-access-gdxtd\") pod \"speaker-phsr6\" (UID: \"46cac2a1-6c87-4c4e-a73f-92dbee290015\") " pod="metallb-system/speaker-phsr6" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.379165 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a42b4a3-c207-40a8-80b9-0532a0ec2865-metrics-certs\") pod \"controller-69bbfbf88f-7t964\" (UID: \"4a42b4a3-c207-40a8-80b9-0532a0ec2865\") " pod="metallb-system/controller-69bbfbf88f-7t964" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.379318 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/46cac2a1-6c87-4c4e-a73f-92dbee290015-metallb-excludel2\") pod \"speaker-phsr6\" (UID: \"46cac2a1-6c87-4c4e-a73f-92dbee290015\") " pod="metallb-system/speaker-phsr6" Feb 19 09:57:30 crc kubenswrapper[4873]: E0219 09:57:30.378249 4873 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 
19 09:57:30 crc kubenswrapper[4873]: E0219 09:57:30.380406 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46cac2a1-6c87-4c4e-a73f-92dbee290015-memberlist podName:46cac2a1-6c87-4c4e-a73f-92dbee290015 nodeName:}" failed. No retries permitted until 2026-02-19 09:57:30.880390971 +0000 UTC m=+760.169822609 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/46cac2a1-6c87-4c4e-a73f-92dbee290015-memberlist") pod "speaker-phsr6" (UID: "46cac2a1-6c87-4c4e-a73f-92dbee290015") : secret "metallb-memberlist" not found Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.380141 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/46cac2a1-6c87-4c4e-a73f-92dbee290015-metallb-excludel2\") pod \"speaker-phsr6\" (UID: \"46cac2a1-6c87-4c4e-a73f-92dbee290015\") " pod="metallb-system/speaker-phsr6" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.380281 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46cac2a1-6c87-4c4e-a73f-92dbee290015-metrics-certs\") pod \"speaker-phsr6\" (UID: \"46cac2a1-6c87-4c4e-a73f-92dbee290015\") " pod="metallb-system/speaker-phsr6" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.380672 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zhr5\" (UniqueName: \"kubernetes.io/projected/4a42b4a3-c207-40a8-80b9-0532a0ec2865-kube-api-access-5zhr5\") pod \"controller-69bbfbf88f-7t964\" (UID: \"4a42b4a3-c207-40a8-80b9-0532a0ec2865\") " pod="metallb-system/controller-69bbfbf88f-7t964" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.380813 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a42b4a3-c207-40a8-80b9-0532a0ec2865-cert\") pod 
\"controller-69bbfbf88f-7t964\" (UID: \"4a42b4a3-c207-40a8-80b9-0532a0ec2865\") " pod="metallb-system/controller-69bbfbf88f-7t964" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.383150 4873 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.385192 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46cac2a1-6c87-4c4e-a73f-92dbee290015-metrics-certs\") pod \"speaker-phsr6\" (UID: \"46cac2a1-6c87-4c4e-a73f-92dbee290015\") " pod="metallb-system/speaker-phsr6" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.385759 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a42b4a3-c207-40a8-80b9-0532a0ec2865-metrics-certs\") pod \"controller-69bbfbf88f-7t964\" (UID: \"4a42b4a3-c207-40a8-80b9-0532a0ec2865\") " pod="metallb-system/controller-69bbfbf88f-7t964" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.386929 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.393815 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a42b4a3-c207-40a8-80b9-0532a0ec2865-cert\") pod \"controller-69bbfbf88f-7t964\" (UID: \"4a42b4a3-c207-40a8-80b9-0532a0ec2865\") " pod="metallb-system/controller-69bbfbf88f-7t964" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.397450 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdxtd\" (UniqueName: \"kubernetes.io/projected/46cac2a1-6c87-4c4e-a73f-92dbee290015-kube-api-access-gdxtd\") pod \"speaker-phsr6\" (UID: \"46cac2a1-6c87-4c4e-a73f-92dbee290015\") " pod="metallb-system/speaker-phsr6" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.408146 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zhr5\" (UniqueName: \"kubernetes.io/projected/4a42b4a3-c207-40a8-80b9-0532a0ec2865-kube-api-access-5zhr5\") pod \"controller-69bbfbf88f-7t964\" (UID: \"4a42b4a3-c207-40a8-80b9-0532a0ec2865\") " pod="metallb-system/controller-69bbfbf88f-7t964" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.527279 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-7t964" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.730935 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76ea40c9-c4a3-4a32-82a5-d725a73db80d-metrics-certs\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.734979 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76ea40c9-c4a3-4a32-82a5-d725a73db80d-metrics-certs\") pod \"frr-k8s-w8fjg\" (UID: \"76ea40c9-c4a3-4a32-82a5-d725a73db80d\") " pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.748171 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-7t964"] Feb 19 09:57:30 crc kubenswrapper[4873]: W0219 09:57:30.866183 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d8f9aee_601f_4530_876b_83709311196b.slice/crio-50fa477cf0ef00f5089eaa25ffc1a1d7d3e686d348d7cf849b1f4e4a207a301d WatchSource:0}: Error finding container 50fa477cf0ef00f5089eaa25ffc1a1d7d3e686d348d7cf849b1f4e4a207a301d: Status 404 returned error can't find the container with id 50fa477cf0ef00f5089eaa25ffc1a1d7d3e686d348d7cf849b1f4e4a207a301d Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.869801 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52"] Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.934699 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/46cac2a1-6c87-4c4e-a73f-92dbee290015-memberlist\") pod \"speaker-phsr6\" (UID: \"46cac2a1-6c87-4c4e-a73f-92dbee290015\") " 
pod="metallb-system/speaker-phsr6" Feb 19 09:57:30 crc kubenswrapper[4873]: E0219 09:57:30.934915 4873 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 19 09:57:30 crc kubenswrapper[4873]: E0219 09:57:30.935466 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46cac2a1-6c87-4c4e-a73f-92dbee290015-memberlist podName:46cac2a1-6c87-4c4e-a73f-92dbee290015 nodeName:}" failed. No retries permitted until 2026-02-19 09:57:31.935433301 +0000 UTC m=+761.224864989 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/46cac2a1-6c87-4c4e-a73f-92dbee290015-memberlist") pod "speaker-phsr6" (UID: "46cac2a1-6c87-4c4e-a73f-92dbee290015") : secret "metallb-memberlist" not found Feb 19 09:57:30 crc kubenswrapper[4873]: I0219 09:57:30.977316 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:31 crc kubenswrapper[4873]: I0219 09:57:31.563968 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w8fjg" event={"ID":"76ea40c9-c4a3-4a32-82a5-d725a73db80d","Type":"ContainerStarted","Data":"502eb2e43f2f3a9f6d40162f61e2e3f680191c4f51f81e6de07fb81e9d863c57"} Feb 19 09:57:31 crc kubenswrapper[4873]: I0219 09:57:31.565709 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52" event={"ID":"8d8f9aee-601f-4530-876b-83709311196b","Type":"ContainerStarted","Data":"50fa477cf0ef00f5089eaa25ffc1a1d7d3e686d348d7cf849b1f4e4a207a301d"} Feb 19 09:57:31 crc kubenswrapper[4873]: I0219 09:57:31.567674 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-7t964" event={"ID":"4a42b4a3-c207-40a8-80b9-0532a0ec2865","Type":"ContainerStarted","Data":"3107787efea3747ddf6f5c4e0ed52d8064c25c5a3b90cc9169d16252e2ff64cb"} Feb 19 09:57:31 crc kubenswrapper[4873]: 
I0219 09:57:31.567741 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-7t964" event={"ID":"4a42b4a3-c207-40a8-80b9-0532a0ec2865","Type":"ContainerStarted","Data":"9d1d12b0aa07c5f612100450fdcfc53911a4e1a3cd5b946085a91fd7c88ca5ea"} Feb 19 09:57:31 crc kubenswrapper[4873]: I0219 09:57:31.567761 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-7t964" event={"ID":"4a42b4a3-c207-40a8-80b9-0532a0ec2865","Type":"ContainerStarted","Data":"1fe7e3152bc984605a86bca2cd2b9adacb5f65a6a2d4ef9f8b5fd3455326a661"} Feb 19 09:57:31 crc kubenswrapper[4873]: I0219 09:57:31.568086 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-7t964" Feb 19 09:57:31 crc kubenswrapper[4873]: I0219 09:57:31.593024 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-7t964" podStartSLOduration=1.593003865 podStartE2EDuration="1.593003865s" podCreationTimestamp="2026-02-19 09:57:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:57:31.589993971 +0000 UTC m=+760.879425649" watchObservedRunningTime="2026-02-19 09:57:31.593003865 +0000 UTC m=+760.882435533" Feb 19 09:57:31 crc kubenswrapper[4873]: I0219 09:57:31.948342 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/46cac2a1-6c87-4c4e-a73f-92dbee290015-memberlist\") pod \"speaker-phsr6\" (UID: \"46cac2a1-6c87-4c4e-a73f-92dbee290015\") " pod="metallb-system/speaker-phsr6" Feb 19 09:57:31 crc kubenswrapper[4873]: I0219 09:57:31.974952 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/46cac2a1-6c87-4c4e-a73f-92dbee290015-memberlist\") pod \"speaker-phsr6\" (UID: 
\"46cac2a1-6c87-4c4e-a73f-92dbee290015\") " pod="metallb-system/speaker-phsr6" Feb 19 09:57:32 crc kubenswrapper[4873]: I0219 09:57:32.262039 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-phsr6" Feb 19 09:57:32 crc kubenswrapper[4873]: I0219 09:57:32.578815 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-phsr6" event={"ID":"46cac2a1-6c87-4c4e-a73f-92dbee290015","Type":"ContainerStarted","Data":"50341f7b27bf4e167ea6496da28c18490963bf3025a411a59219ecb858afecef"} Feb 19 09:57:33 crc kubenswrapper[4873]: I0219 09:57:33.593247 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-phsr6" event={"ID":"46cac2a1-6c87-4c4e-a73f-92dbee290015","Type":"ContainerStarted","Data":"4b7f8ef06662077054851d192bdf57df18e6a9b7678003895ef6cc22216f8e4c"} Feb 19 09:57:33 crc kubenswrapper[4873]: I0219 09:57:33.593713 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-phsr6" event={"ID":"46cac2a1-6c87-4c4e-a73f-92dbee290015","Type":"ContainerStarted","Data":"80d45fb6c8582285aad835bf9aa4b8086c2108b74e51ea444c6769f7e66cf7cc"} Feb 19 09:57:33 crc kubenswrapper[4873]: I0219 09:57:33.593763 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-phsr6" Feb 19 09:57:33 crc kubenswrapper[4873]: I0219 09:57:33.619758 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-phsr6" podStartSLOduration=3.619741033 podStartE2EDuration="3.619741033s" podCreationTimestamp="2026-02-19 09:57:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:57:33.618039172 +0000 UTC m=+762.907470810" watchObservedRunningTime="2026-02-19 09:57:33.619741033 +0000 UTC m=+762.909172671" Feb 19 09:57:38 crc kubenswrapper[4873]: I0219 09:57:38.640251 4873 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52" event={"ID":"8d8f9aee-601f-4530-876b-83709311196b","Type":"ContainerStarted","Data":"27db045438f34c265d79c2757d753ddcad6e64945bfa017dffe283c5406f064c"} Feb 19 09:57:38 crc kubenswrapper[4873]: I0219 09:57:38.640918 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52" Feb 19 09:57:38 crc kubenswrapper[4873]: I0219 09:57:38.642640 4873 generic.go:334] "Generic (PLEG): container finished" podID="76ea40c9-c4a3-4a32-82a5-d725a73db80d" containerID="d91624a39ed8612251e78f0a46d737d8e6dae41106c6b038ee3149ba85b509d4" exitCode=0 Feb 19 09:57:38 crc kubenswrapper[4873]: I0219 09:57:38.642884 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w8fjg" event={"ID":"76ea40c9-c4a3-4a32-82a5-d725a73db80d","Type":"ContainerDied","Data":"d91624a39ed8612251e78f0a46d737d8e6dae41106c6b038ee3149ba85b509d4"} Feb 19 09:57:38 crc kubenswrapper[4873]: I0219 09:57:38.677500 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52" podStartSLOduration=1.8214987580000002 podStartE2EDuration="8.677480613s" podCreationTimestamp="2026-02-19 09:57:30 +0000 UTC" firstStartedPulling="2026-02-19 09:57:30.86933551 +0000 UTC m=+760.158767168" lastFinishedPulling="2026-02-19 09:57:37.725317385 +0000 UTC m=+767.014749023" observedRunningTime="2026-02-19 09:57:38.676668783 +0000 UTC m=+767.966100431" watchObservedRunningTime="2026-02-19 09:57:38.677480613 +0000 UTC m=+767.966912251" Feb 19 09:57:39 crc kubenswrapper[4873]: I0219 09:57:39.652615 4873 generic.go:334] "Generic (PLEG): container finished" podID="76ea40c9-c4a3-4a32-82a5-d725a73db80d" containerID="b8c21074afcf79a25db06d86dc625760db617720217f370bd95bfbd4acba9b3d" exitCode=0 Feb 19 09:57:39 crc kubenswrapper[4873]: I0219 09:57:39.652730 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-w8fjg" event={"ID":"76ea40c9-c4a3-4a32-82a5-d725a73db80d","Type":"ContainerDied","Data":"b8c21074afcf79a25db06d86dc625760db617720217f370bd95bfbd4acba9b3d"} Feb 19 09:57:40 crc kubenswrapper[4873]: I0219 09:57:40.532241 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-7t964" Feb 19 09:57:40 crc kubenswrapper[4873]: I0219 09:57:40.661929 4873 generic.go:334] "Generic (PLEG): container finished" podID="76ea40c9-c4a3-4a32-82a5-d725a73db80d" containerID="392986b6ec7137e2168e32255030f078fcefdf7c4c29560fdb67923521859075" exitCode=0 Feb 19 09:57:40 crc kubenswrapper[4873]: I0219 09:57:40.661977 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w8fjg" event={"ID":"76ea40c9-c4a3-4a32-82a5-d725a73db80d","Type":"ContainerDied","Data":"392986b6ec7137e2168e32255030f078fcefdf7c4c29560fdb67923521859075"} Feb 19 09:57:41 crc kubenswrapper[4873]: I0219 09:57:41.671640 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w8fjg" event={"ID":"76ea40c9-c4a3-4a32-82a5-d725a73db80d","Type":"ContainerStarted","Data":"0b85d63bbb9c52786ce94d20e2066ebd5e26704d074f3fa60b6cb61f657ea5cc"} Feb 19 09:57:41 crc kubenswrapper[4873]: I0219 09:57:41.671954 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w8fjg" event={"ID":"76ea40c9-c4a3-4a32-82a5-d725a73db80d","Type":"ContainerStarted","Data":"73f780833dc2cd9c339c87a5e837f4c9cc30253ce6d843aa55b8ae0d4928323e"} Feb 19 09:57:41 crc kubenswrapper[4873]: I0219 09:57:41.671963 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w8fjg" event={"ID":"76ea40c9-c4a3-4a32-82a5-d725a73db80d","Type":"ContainerStarted","Data":"afa4c1c85a3d2b87412bc6722f0f313aae7eeafb3d859b3acb2335770f4ccdc1"} Feb 19 09:57:41 crc kubenswrapper[4873]: I0219 09:57:41.671973 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w8fjg" 
event={"ID":"76ea40c9-c4a3-4a32-82a5-d725a73db80d","Type":"ContainerStarted","Data":"f1a7c0ccde0b78a20f60db7550932b73ef0837ed0d528898e01440ec1e8f1d84"} Feb 19 09:57:41 crc kubenswrapper[4873]: I0219 09:57:41.671980 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w8fjg" event={"ID":"76ea40c9-c4a3-4a32-82a5-d725a73db80d","Type":"ContainerStarted","Data":"c6843e5d224cb646753fa10b2195544eae84f655d977eaa287a3dab21b1c9cd6"} Feb 19 09:57:42 crc kubenswrapper[4873]: I0219 09:57:42.268503 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-phsr6" Feb 19 09:57:42 crc kubenswrapper[4873]: I0219 09:57:42.686558 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w8fjg" event={"ID":"76ea40c9-c4a3-4a32-82a5-d725a73db80d","Type":"ContainerStarted","Data":"6d35937371da92a07d0cb6211844c14f05a4de8ba2b7938f33a0e3bdb0a9e287"} Feb 19 09:57:42 crc kubenswrapper[4873]: I0219 09:57:42.686924 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:42 crc kubenswrapper[4873]: I0219 09:57:42.724935 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-w8fjg" podStartSLOduration=6.043018214 podStartE2EDuration="12.724741986s" podCreationTimestamp="2026-02-19 09:57:30 +0000 UTC" firstStartedPulling="2026-02-19 09:57:31.06019557 +0000 UTC m=+760.349627208" lastFinishedPulling="2026-02-19 09:57:37.741919342 +0000 UTC m=+767.031350980" observedRunningTime="2026-02-19 09:57:42.723095526 +0000 UTC m=+772.012527174" watchObservedRunningTime="2026-02-19 09:57:42.724741986 +0000 UTC m=+772.014173624" Feb 19 09:57:44 crc kubenswrapper[4873]: I0219 09:57:44.942980 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-fv6j2"] Feb 19 09:57:44 crc kubenswrapper[4873]: I0219 09:57:44.944433 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-fv6j2" Feb 19 09:57:44 crc kubenswrapper[4873]: I0219 09:57:44.947196 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-nl9dk" Feb 19 09:57:44 crc kubenswrapper[4873]: I0219 09:57:44.948313 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 19 09:57:44 crc kubenswrapper[4873]: I0219 09:57:44.952300 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 19 09:57:45 crc kubenswrapper[4873]: I0219 09:57:45.012369 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fv6j2"] Feb 19 09:57:45 crc kubenswrapper[4873]: I0219 09:57:45.033348 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swp6m\" (UniqueName: \"kubernetes.io/projected/4c48262d-0e66-4844-95f7-1e8daf0d1acb-kube-api-access-swp6m\") pod \"openstack-operator-index-fv6j2\" (UID: \"4c48262d-0e66-4844-95f7-1e8daf0d1acb\") " pod="openstack-operators/openstack-operator-index-fv6j2" Feb 19 09:57:45 crc kubenswrapper[4873]: I0219 09:57:45.134538 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swp6m\" (UniqueName: \"kubernetes.io/projected/4c48262d-0e66-4844-95f7-1e8daf0d1acb-kube-api-access-swp6m\") pod \"openstack-operator-index-fv6j2\" (UID: \"4c48262d-0e66-4844-95f7-1e8daf0d1acb\") " pod="openstack-operators/openstack-operator-index-fv6j2" Feb 19 09:57:45 crc kubenswrapper[4873]: I0219 09:57:45.161081 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swp6m\" (UniqueName: \"kubernetes.io/projected/4c48262d-0e66-4844-95f7-1e8daf0d1acb-kube-api-access-swp6m\") pod \"openstack-operator-index-fv6j2\" (UID: 
\"4c48262d-0e66-4844-95f7-1e8daf0d1acb\") " pod="openstack-operators/openstack-operator-index-fv6j2" Feb 19 09:57:45 crc kubenswrapper[4873]: I0219 09:57:45.265958 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fv6j2" Feb 19 09:57:45 crc kubenswrapper[4873]: I0219 09:57:45.535455 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fv6j2"] Feb 19 09:57:45 crc kubenswrapper[4873]: W0219 09:57:45.542495 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c48262d_0e66_4844_95f7_1e8daf0d1acb.slice/crio-5cec17efc1266c507d6d6ad9faa7f6a18e52943217e6723af9b572fe489c2d56 WatchSource:0}: Error finding container 5cec17efc1266c507d6d6ad9faa7f6a18e52943217e6723af9b572fe489c2d56: Status 404 returned error can't find the container with id 5cec17efc1266c507d6d6ad9faa7f6a18e52943217e6723af9b572fe489c2d56 Feb 19 09:57:45 crc kubenswrapper[4873]: I0219 09:57:45.714356 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fv6j2" event={"ID":"4c48262d-0e66-4844-95f7-1e8daf0d1acb","Type":"ContainerStarted","Data":"5cec17efc1266c507d6d6ad9faa7f6a18e52943217e6723af9b572fe489c2d56"} Feb 19 09:57:45 crc kubenswrapper[4873]: I0219 09:57:45.977989 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:46 crc kubenswrapper[4873]: I0219 09:57:46.024944 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.240747 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.241139 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.241204 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.241987 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"025da7fd171f987961d862fe4ebef489eca80227003392ad78806aa501904663"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.242081 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://025da7fd171f987961d862fe4ebef489eca80227003392ad78806aa501904663" gracePeriod=600 Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.319944 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-fv6j2"] Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.744609 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="025da7fd171f987961d862fe4ebef489eca80227003392ad78806aa501904663" exitCode=0 Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.744705 4873 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"025da7fd171f987961d862fe4ebef489eca80227003392ad78806aa501904663"} Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.744766 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"4e9052ea8663914dbd7738866b6f51c9865aab9ba0562919ffd7db3fb01e7ded"} Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.744793 4873 scope.go:117] "RemoveContainer" containerID="ebff3f80b0b9d54ded2014067bb39816bc67366aec6359774e3b0cd08dfce552" Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.749260 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fv6j2" event={"ID":"4c48262d-0e66-4844-95f7-1e8daf0d1acb","Type":"ContainerStarted","Data":"e90eef3aacf035d791c4bf60c97b62867b98f358059e5b065938960b6a79ea5e"} Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.796459 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-fv6j2" podStartSLOduration=2.546975459 podStartE2EDuration="4.796390858s" podCreationTimestamp="2026-02-19 09:57:44 +0000 UTC" firstStartedPulling="2026-02-19 09:57:45.546219051 +0000 UTC m=+774.835650699" lastFinishedPulling="2026-02-19 09:57:47.79563446 +0000 UTC m=+777.085066098" observedRunningTime="2026-02-19 09:57:48.794587944 +0000 UTC m=+778.084019622" watchObservedRunningTime="2026-02-19 09:57:48.796390858 +0000 UTC m=+778.085822526" Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.931747 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-p62rb"] Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.933216 4873 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-index-p62rb" Feb 19 09:57:48 crc kubenswrapper[4873]: I0219 09:57:48.953651 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p62rb"] Feb 19 09:57:49 crc kubenswrapper[4873]: I0219 09:57:49.108453 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjgp8\" (UniqueName: \"kubernetes.io/projected/0144fe1c-ef13-4b4e-8cda-ddc72e2516bb-kube-api-access-hjgp8\") pod \"openstack-operator-index-p62rb\" (UID: \"0144fe1c-ef13-4b4e-8cda-ddc72e2516bb\") " pod="openstack-operators/openstack-operator-index-p62rb" Feb 19 09:57:49 crc kubenswrapper[4873]: I0219 09:57:49.209614 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjgp8\" (UniqueName: \"kubernetes.io/projected/0144fe1c-ef13-4b4e-8cda-ddc72e2516bb-kube-api-access-hjgp8\") pod \"openstack-operator-index-p62rb\" (UID: \"0144fe1c-ef13-4b4e-8cda-ddc72e2516bb\") " pod="openstack-operators/openstack-operator-index-p62rb" Feb 19 09:57:49 crc kubenswrapper[4873]: I0219 09:57:49.242078 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjgp8\" (UniqueName: \"kubernetes.io/projected/0144fe1c-ef13-4b4e-8cda-ddc72e2516bb-kube-api-access-hjgp8\") pod \"openstack-operator-index-p62rb\" (UID: \"0144fe1c-ef13-4b4e-8cda-ddc72e2516bb\") " pod="openstack-operators/openstack-operator-index-p62rb" Feb 19 09:57:49 crc kubenswrapper[4873]: I0219 09:57:49.293712 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p62rb" Feb 19 09:57:49 crc kubenswrapper[4873]: I0219 09:57:49.556056 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p62rb"] Feb 19 09:57:49 crc kubenswrapper[4873]: W0219 09:57:49.564061 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0144fe1c_ef13_4b4e_8cda_ddc72e2516bb.slice/crio-b4b01b5de11f95b1a6872a6bcdaeabcf004b864462980bdd22ff853a5962f07b WatchSource:0}: Error finding container b4b01b5de11f95b1a6872a6bcdaeabcf004b864462980bdd22ff853a5962f07b: Status 404 returned error can't find the container with id b4b01b5de11f95b1a6872a6bcdaeabcf004b864462980bdd22ff853a5962f07b Feb 19 09:57:49 crc kubenswrapper[4873]: I0219 09:57:49.758585 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p62rb" event={"ID":"0144fe1c-ef13-4b4e-8cda-ddc72e2516bb","Type":"ContainerStarted","Data":"b4b01b5de11f95b1a6872a6bcdaeabcf004b864462980bdd22ff853a5962f07b"} Feb 19 09:57:49 crc kubenswrapper[4873]: I0219 09:57:49.761559 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-fv6j2" podUID="4c48262d-0e66-4844-95f7-1e8daf0d1acb" containerName="registry-server" containerID="cri-o://e90eef3aacf035d791c4bf60c97b62867b98f358059e5b065938960b6a79ea5e" gracePeriod=2 Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.144936 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-fv6j2" Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.329658 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swp6m\" (UniqueName: \"kubernetes.io/projected/4c48262d-0e66-4844-95f7-1e8daf0d1acb-kube-api-access-swp6m\") pod \"4c48262d-0e66-4844-95f7-1e8daf0d1acb\" (UID: \"4c48262d-0e66-4844-95f7-1e8daf0d1acb\") " Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.337725 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c48262d-0e66-4844-95f7-1e8daf0d1acb-kube-api-access-swp6m" (OuterVolumeSpecName: "kube-api-access-swp6m") pod "4c48262d-0e66-4844-95f7-1e8daf0d1acb" (UID: "4c48262d-0e66-4844-95f7-1e8daf0d1acb"). InnerVolumeSpecName "kube-api-access-swp6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.398624 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-xwr52" Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.431889 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swp6m\" (UniqueName: \"kubernetes.io/projected/4c48262d-0e66-4844-95f7-1e8daf0d1acb-kube-api-access-swp6m\") on node \"crc\" DevicePath \"\"" Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.771587 4873 generic.go:334] "Generic (PLEG): container finished" podID="4c48262d-0e66-4844-95f7-1e8daf0d1acb" containerID="e90eef3aacf035d791c4bf60c97b62867b98f358059e5b065938960b6a79ea5e" exitCode=0 Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.771634 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fv6j2" event={"ID":"4c48262d-0e66-4844-95f7-1e8daf0d1acb","Type":"ContainerDied","Data":"e90eef3aacf035d791c4bf60c97b62867b98f358059e5b065938960b6a79ea5e"} Feb 19 09:57:50 crc 
kubenswrapper[4873]: I0219 09:57:50.771661 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fv6j2" Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.771696 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fv6j2" event={"ID":"4c48262d-0e66-4844-95f7-1e8daf0d1acb","Type":"ContainerDied","Data":"5cec17efc1266c507d6d6ad9faa7f6a18e52943217e6723af9b572fe489c2d56"} Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.771748 4873 scope.go:117] "RemoveContainer" containerID="e90eef3aacf035d791c4bf60c97b62867b98f358059e5b065938960b6a79ea5e" Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.775286 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p62rb" event={"ID":"0144fe1c-ef13-4b4e-8cda-ddc72e2516bb","Type":"ContainerStarted","Data":"c04b22b737ca79accb8d481d3d61932e9ac82753c96c288882c4f598bc0dd02b"} Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.801254 4873 scope.go:117] "RemoveContainer" containerID="e90eef3aacf035d791c4bf60c97b62867b98f358059e5b065938960b6a79ea5e" Feb 19 09:57:50 crc kubenswrapper[4873]: E0219 09:57:50.804416 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e90eef3aacf035d791c4bf60c97b62867b98f358059e5b065938960b6a79ea5e\": container with ID starting with e90eef3aacf035d791c4bf60c97b62867b98f358059e5b065938960b6a79ea5e not found: ID does not exist" containerID="e90eef3aacf035d791c4bf60c97b62867b98f358059e5b065938960b6a79ea5e" Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.804531 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e90eef3aacf035d791c4bf60c97b62867b98f358059e5b065938960b6a79ea5e"} err="failed to get container status \"e90eef3aacf035d791c4bf60c97b62867b98f358059e5b065938960b6a79ea5e\": rpc 
error: code = NotFound desc = could not find container \"e90eef3aacf035d791c4bf60c97b62867b98f358059e5b065938960b6a79ea5e\": container with ID starting with e90eef3aacf035d791c4bf60c97b62867b98f358059e5b065938960b6a79ea5e not found: ID does not exist" Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.815033 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-p62rb" podStartSLOduration=2.762398067 podStartE2EDuration="2.815011417s" podCreationTimestamp="2026-02-19 09:57:48 +0000 UTC" firstStartedPulling="2026-02-19 09:57:49.568210154 +0000 UTC m=+778.857641792" lastFinishedPulling="2026-02-19 09:57:49.620823484 +0000 UTC m=+778.910255142" observedRunningTime="2026-02-19 09:57:50.796606816 +0000 UTC m=+780.086038454" watchObservedRunningTime="2026-02-19 09:57:50.815011417 +0000 UTC m=+780.104443055" Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.819619 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-fv6j2"] Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.823575 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-fv6j2"] Feb 19 09:57:50 crc kubenswrapper[4873]: I0219 09:57:50.981083 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-w8fjg" Feb 19 09:57:51 crc kubenswrapper[4873]: I0219 09:57:51.495336 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c48262d-0e66-4844-95f7-1e8daf0d1acb" path="/var/lib/kubelet/pods/4c48262d-0e66-4844-95f7-1e8daf0d1acb/volumes" Feb 19 09:57:59 crc kubenswrapper[4873]: I0219 09:57:59.294325 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-p62rb" Feb 19 09:57:59 crc kubenswrapper[4873]: I0219 09:57:59.295041 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-index-p62rb" Feb 19 09:57:59 crc kubenswrapper[4873]: I0219 09:57:59.336677 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-p62rb" Feb 19 09:57:59 crc kubenswrapper[4873]: I0219 09:57:59.880223 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-p62rb" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.279757 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6"] Feb 19 09:58:06 crc kubenswrapper[4873]: E0219 09:58:06.280351 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c48262d-0e66-4844-95f7-1e8daf0d1acb" containerName="registry-server" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.280367 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c48262d-0e66-4844-95f7-1e8daf0d1acb" containerName="registry-server" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.280519 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c48262d-0e66-4844-95f7-1e8daf0d1acb" containerName="registry-server" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.281695 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.285019 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-58jgl" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.293365 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6"] Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.468848 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78582e6c-dedc-4608-a542-6837184954ab-util\") pod \"8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6\" (UID: \"78582e6c-dedc-4608-a542-6837184954ab\") " pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.468964 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78582e6c-dedc-4608-a542-6837184954ab-bundle\") pod \"8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6\" (UID: \"78582e6c-dedc-4608-a542-6837184954ab\") " pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.469012 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9v8z\" (UniqueName: \"kubernetes.io/projected/78582e6c-dedc-4608-a542-6837184954ab-kube-api-access-h9v8z\") pod \"8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6\" (UID: \"78582e6c-dedc-4608-a542-6837184954ab\") " pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 
09:58:06.570854 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78582e6c-dedc-4608-a542-6837184954ab-util\") pod \"8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6\" (UID: \"78582e6c-dedc-4608-a542-6837184954ab\") " pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.570965 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78582e6c-dedc-4608-a542-6837184954ab-bundle\") pod \"8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6\" (UID: \"78582e6c-dedc-4608-a542-6837184954ab\") " pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.571057 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9v8z\" (UniqueName: \"kubernetes.io/projected/78582e6c-dedc-4608-a542-6837184954ab-kube-api-access-h9v8z\") pod \"8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6\" (UID: \"78582e6c-dedc-4608-a542-6837184954ab\") " pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.571829 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78582e6c-dedc-4608-a542-6837184954ab-util\") pod \"8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6\" (UID: \"78582e6c-dedc-4608-a542-6837184954ab\") " pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.572239 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/78582e6c-dedc-4608-a542-6837184954ab-bundle\") pod \"8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6\" (UID: \"78582e6c-dedc-4608-a542-6837184954ab\") " pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.607514 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9v8z\" (UniqueName: \"kubernetes.io/projected/78582e6c-dedc-4608-a542-6837184954ab-kube-api-access-h9v8z\") pod \"8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6\" (UID: \"78582e6c-dedc-4608-a542-6837184954ab\") " pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.608182 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.861955 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6"] Feb 19 09:58:06 crc kubenswrapper[4873]: W0219 09:58:06.872829 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78582e6c_dedc_4608_a542_6837184954ab.slice/crio-8acf5b88ea8a6168e48f70276b84b945ce041f83c51ac8f7faf5f8080114094e WatchSource:0}: Error finding container 8acf5b88ea8a6168e48f70276b84b945ce041f83c51ac8f7faf5f8080114094e: Status 404 returned error can't find the container with id 8acf5b88ea8a6168e48f70276b84b945ce041f83c51ac8f7faf5f8080114094e Feb 19 09:58:06 crc kubenswrapper[4873]: I0219 09:58:06.898438 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" 
event={"ID":"78582e6c-dedc-4608-a542-6837184954ab","Type":"ContainerStarted","Data":"8acf5b88ea8a6168e48f70276b84b945ce041f83c51ac8f7faf5f8080114094e"} Feb 19 09:58:07 crc kubenswrapper[4873]: I0219 09:58:07.909872 4873 generic.go:334] "Generic (PLEG): container finished" podID="78582e6c-dedc-4608-a542-6837184954ab" containerID="041b2406b02a94c54e85ab5c62fae0d9c8bb4d656ef6b673d5b415f5bffe8768" exitCode=0 Feb 19 09:58:07 crc kubenswrapper[4873]: I0219 09:58:07.909927 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" event={"ID":"78582e6c-dedc-4608-a542-6837184954ab","Type":"ContainerDied","Data":"041b2406b02a94c54e85ab5c62fae0d9c8bb4d656ef6b673d5b415f5bffe8768"} Feb 19 09:58:08 crc kubenswrapper[4873]: I0219 09:58:08.920799 4873 generic.go:334] "Generic (PLEG): container finished" podID="78582e6c-dedc-4608-a542-6837184954ab" containerID="26612d1be40fb7f50e8c040bf74b4074bc70885f62408f30440ce32acc781d7b" exitCode=0 Feb 19 09:58:08 crc kubenswrapper[4873]: I0219 09:58:08.920878 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" event={"ID":"78582e6c-dedc-4608-a542-6837184954ab","Type":"ContainerDied","Data":"26612d1be40fb7f50e8c040bf74b4074bc70885f62408f30440ce32acc781d7b"} Feb 19 09:58:09 crc kubenswrapper[4873]: I0219 09:58:09.930587 4873 generic.go:334] "Generic (PLEG): container finished" podID="78582e6c-dedc-4608-a542-6837184954ab" containerID="a4143a0970bf1987ea6ea034b639e03cf1f08f5db13229767843b84977f5b054" exitCode=0 Feb 19 09:58:09 crc kubenswrapper[4873]: I0219 09:58:09.930693 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" event={"ID":"78582e6c-dedc-4608-a542-6837184954ab","Type":"ContainerDied","Data":"a4143a0970bf1987ea6ea034b639e03cf1f08f5db13229767843b84977f5b054"} Feb 19 
09:58:11 crc kubenswrapper[4873]: I0219 09:58:11.387408 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" Feb 19 09:58:11 crc kubenswrapper[4873]: I0219 09:58:11.556642 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9v8z\" (UniqueName: \"kubernetes.io/projected/78582e6c-dedc-4608-a542-6837184954ab-kube-api-access-h9v8z\") pod \"78582e6c-dedc-4608-a542-6837184954ab\" (UID: \"78582e6c-dedc-4608-a542-6837184954ab\") " Feb 19 09:58:11 crc kubenswrapper[4873]: I0219 09:58:11.556744 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78582e6c-dedc-4608-a542-6837184954ab-bundle\") pod \"78582e6c-dedc-4608-a542-6837184954ab\" (UID: \"78582e6c-dedc-4608-a542-6837184954ab\") " Feb 19 09:58:11 crc kubenswrapper[4873]: I0219 09:58:11.556966 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78582e6c-dedc-4608-a542-6837184954ab-util\") pod \"78582e6c-dedc-4608-a542-6837184954ab\" (UID: \"78582e6c-dedc-4608-a542-6837184954ab\") " Feb 19 09:58:11 crc kubenswrapper[4873]: I0219 09:58:11.557706 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78582e6c-dedc-4608-a542-6837184954ab-bundle" (OuterVolumeSpecName: "bundle") pod "78582e6c-dedc-4608-a542-6837184954ab" (UID: "78582e6c-dedc-4608-a542-6837184954ab"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:58:11 crc kubenswrapper[4873]: I0219 09:58:11.563262 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78582e6c-dedc-4608-a542-6837184954ab-kube-api-access-h9v8z" (OuterVolumeSpecName: "kube-api-access-h9v8z") pod "78582e6c-dedc-4608-a542-6837184954ab" (UID: "78582e6c-dedc-4608-a542-6837184954ab"). InnerVolumeSpecName "kube-api-access-h9v8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:58:11 crc kubenswrapper[4873]: I0219 09:58:11.576280 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78582e6c-dedc-4608-a542-6837184954ab-util" (OuterVolumeSpecName: "util") pod "78582e6c-dedc-4608-a542-6837184954ab" (UID: "78582e6c-dedc-4608-a542-6837184954ab"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:58:11 crc kubenswrapper[4873]: I0219 09:58:11.658940 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9v8z\" (UniqueName: \"kubernetes.io/projected/78582e6c-dedc-4608-a542-6837184954ab-kube-api-access-h9v8z\") on node \"crc\" DevicePath \"\"" Feb 19 09:58:11 crc kubenswrapper[4873]: I0219 09:58:11.659063 4873 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78582e6c-dedc-4608-a542-6837184954ab-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:58:11 crc kubenswrapper[4873]: I0219 09:58:11.659083 4873 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78582e6c-dedc-4608-a542-6837184954ab-util\") on node \"crc\" DevicePath \"\"" Feb 19 09:58:11 crc kubenswrapper[4873]: I0219 09:58:11.946986 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" 
event={"ID":"78582e6c-dedc-4608-a542-6837184954ab","Type":"ContainerDied","Data":"8acf5b88ea8a6168e48f70276b84b945ce041f83c51ac8f7faf5f8080114094e"} Feb 19 09:58:11 crc kubenswrapper[4873]: I0219 09:58:11.947037 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8acf5b88ea8a6168e48f70276b84b945ce041f83c51ac8f7faf5f8080114094e" Feb 19 09:58:11 crc kubenswrapper[4873]: I0219 09:58:11.947069 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6" Feb 19 09:58:18 crc kubenswrapper[4873]: I0219 09:58:18.476654 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-8476bb6847-rv4sx"] Feb 19 09:58:18 crc kubenswrapper[4873]: E0219 09:58:18.477309 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78582e6c-dedc-4608-a542-6837184954ab" containerName="extract" Feb 19 09:58:18 crc kubenswrapper[4873]: I0219 09:58:18.477321 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="78582e6c-dedc-4608-a542-6837184954ab" containerName="extract" Feb 19 09:58:18 crc kubenswrapper[4873]: E0219 09:58:18.477334 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78582e6c-dedc-4608-a542-6837184954ab" containerName="pull" Feb 19 09:58:18 crc kubenswrapper[4873]: I0219 09:58:18.477340 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="78582e6c-dedc-4608-a542-6837184954ab" containerName="pull" Feb 19 09:58:18 crc kubenswrapper[4873]: E0219 09:58:18.477359 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78582e6c-dedc-4608-a542-6837184954ab" containerName="util" Feb 19 09:58:18 crc kubenswrapper[4873]: I0219 09:58:18.477369 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="78582e6c-dedc-4608-a542-6837184954ab" containerName="util" Feb 19 09:58:18 crc kubenswrapper[4873]: I0219 09:58:18.477473 4873 
memory_manager.go:354] "RemoveStaleState removing state" podUID="78582e6c-dedc-4608-a542-6837184954ab" containerName="extract" Feb 19 09:58:18 crc kubenswrapper[4873]: I0219 09:58:18.477882 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-8476bb6847-rv4sx" Feb 19 09:58:18 crc kubenswrapper[4873]: I0219 09:58:18.479740 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-r5vxp" Feb 19 09:58:18 crc kubenswrapper[4873]: I0219 09:58:18.496629 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-8476bb6847-rv4sx"] Feb 19 09:58:18 crc kubenswrapper[4873]: I0219 09:58:18.569708 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdg5\" (UniqueName: \"kubernetes.io/projected/e18b6851-e022-488e-bd95-27d1659f2761-kube-api-access-vgdg5\") pod \"openstack-operator-controller-init-8476bb6847-rv4sx\" (UID: \"e18b6851-e022-488e-bd95-27d1659f2761\") " pod="openstack-operators/openstack-operator-controller-init-8476bb6847-rv4sx" Feb 19 09:58:18 crc kubenswrapper[4873]: I0219 09:58:18.671288 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdg5\" (UniqueName: \"kubernetes.io/projected/e18b6851-e022-488e-bd95-27d1659f2761-kube-api-access-vgdg5\") pod \"openstack-operator-controller-init-8476bb6847-rv4sx\" (UID: \"e18b6851-e022-488e-bd95-27d1659f2761\") " pod="openstack-operators/openstack-operator-controller-init-8476bb6847-rv4sx" Feb 19 09:58:18 crc kubenswrapper[4873]: I0219 09:58:18.693935 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgdg5\" (UniqueName: \"kubernetes.io/projected/e18b6851-e022-488e-bd95-27d1659f2761-kube-api-access-vgdg5\") pod 
\"openstack-operator-controller-init-8476bb6847-rv4sx\" (UID: \"e18b6851-e022-488e-bd95-27d1659f2761\") " pod="openstack-operators/openstack-operator-controller-init-8476bb6847-rv4sx" Feb 19 09:58:18 crc kubenswrapper[4873]: I0219 09:58:18.794617 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-8476bb6847-rv4sx" Feb 19 09:58:19 crc kubenswrapper[4873]: I0219 09:58:19.026420 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-8476bb6847-rv4sx"] Feb 19 09:58:20 crc kubenswrapper[4873]: I0219 09:58:20.008236 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-8476bb6847-rv4sx" event={"ID":"e18b6851-e022-488e-bd95-27d1659f2761","Type":"ContainerStarted","Data":"4f9ee02448624ca92a305341318ceb217afb0fb07dc22351c5e3bf240bb856c9"} Feb 19 09:58:24 crc kubenswrapper[4873]: I0219 09:58:24.040808 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-8476bb6847-rv4sx" event={"ID":"e18b6851-e022-488e-bd95-27d1659f2761","Type":"ContainerStarted","Data":"94ba8805ea630d1c4aa4bf2bcd537d207f79163d5a6d8dda395f986aa25f179a"} Feb 19 09:58:24 crc kubenswrapper[4873]: I0219 09:58:24.041145 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-8476bb6847-rv4sx" Feb 19 09:58:24 crc kubenswrapper[4873]: I0219 09:58:24.090433 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-8476bb6847-rv4sx" podStartSLOduration=2.111480418 podStartE2EDuration="6.090399956s" podCreationTimestamp="2026-02-19 09:58:18 +0000 UTC" firstStartedPulling="2026-02-19 09:58:19.0418682 +0000 UTC m=+808.331299838" lastFinishedPulling="2026-02-19 09:58:23.020787738 +0000 UTC m=+812.310219376" 
observedRunningTime="2026-02-19 09:58:24.081769544 +0000 UTC m=+813.371201202" watchObservedRunningTime="2026-02-19 09:58:24.090399956 +0000 UTC m=+813.379831664" Feb 19 09:58:28 crc kubenswrapper[4873]: I0219 09:58:28.799053 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-8476bb6847-rv4sx" Feb 19 09:59:06 crc kubenswrapper[4873]: I0219 09:59:06.946834 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-hqmvw"] Feb 19 09:59:06 crc kubenswrapper[4873]: I0219 09:59:06.948089 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-hqmvw" Feb 19 09:59:06 crc kubenswrapper[4873]: I0219 09:59:06.961935 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-hqmvw"] Feb 19 09:59:06 crc kubenswrapper[4873]: I0219 09:59:06.971688 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-cx7xf"] Feb 19 09:59:06 crc kubenswrapper[4873]: I0219 09:59:06.972762 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-cx7xf" Feb 19 09:59:06 crc kubenswrapper[4873]: I0219 09:59:06.973289 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ssclt" Feb 19 09:59:06 crc kubenswrapper[4873]: I0219 09:59:06.986814 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-jnv7r" Feb 19 09:59:06 crc kubenswrapper[4873]: I0219 09:59:06.989937 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-cx7xf"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.018651 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-t54x9"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.022695 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-t54x9" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.033264 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-dsdww" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.035489 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-vgxsl"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.037932 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-vgxsl" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.063703 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-8n42n" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.064958 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-t54x9"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.078517 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-vwx5n"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.082471 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vwx5n" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.083571 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-vgxsl"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.087484 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-lz99m" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.094404 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-vwx5n"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.101381 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-r9b5b"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.102210 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-r9b5b" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.105981 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-24xwf" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.119385 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-r9b5b"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.131078 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-4t46s"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.131992 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.138517 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-4t46s"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.142186 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.142202 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4l99v" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.143159 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpc8l\" (UniqueName: \"kubernetes.io/projected/d53d2bae-fcdd-408c-9950-440e841cc035-kube-api-access-jpc8l\") pod \"barbican-operator-controller-manager-868647ff47-hqmvw\" (UID: \"d53d2bae-fcdd-408c-9950-440e841cc035\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-hqmvw" Feb 19 
09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.143232 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvsp8\" (UniqueName: \"kubernetes.io/projected/43531003-74d3-43b9-b0f5-6fca42b21975-kube-api-access-tvsp8\") pod \"glance-operator-controller-manager-77987464f4-vgxsl\" (UID: \"43531003-74d3-43b9-b0f5-6fca42b21975\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-vgxsl" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.143264 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm2n2\" (UniqueName: \"kubernetes.io/projected/2e7ca3f2-f73b-4bac-93bb-68b2518d956e-kube-api-access-lm2n2\") pod \"cinder-operator-controller-manager-5d946d989d-cx7xf\" (UID: \"2e7ca3f2-f73b-4bac-93bb-68b2518d956e\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-cx7xf" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.143317 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9bl4\" (UniqueName: \"kubernetes.io/projected/f108f6ea-4506-48bf-b948-e367078c3dce-kube-api-access-c9bl4\") pod \"designate-operator-controller-manager-6d8bf5c495-t54x9\" (UID: \"f108f6ea-4506-48bf-b948-e367078c3dce\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-t54x9" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.158495 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-t7mwr"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.159724 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t7mwr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.163142 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.164552 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.165828 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2fmd9" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.167075 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-96ft2" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.174329 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-t7mwr"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.198189 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-f86jr"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.199081 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-f86jr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.204959 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-szjl5" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.205262 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.213778 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-f86jr"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.222618 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.223430 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.229338 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-xb8zh" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.237233 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.238397 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.243943 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-rnfvw" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.244156 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hnkj\" (UniqueName: \"kubernetes.io/projected/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-kube-api-access-8hnkj\") pod \"infra-operator-controller-manager-79d975b745-4t46s\" (UID: \"3ff0155f-08fd-42f5-9b31-c3b9a7cefefe\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.244222 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpc8l\" (UniqueName: \"kubernetes.io/projected/d53d2bae-fcdd-408c-9950-440e841cc035-kube-api-access-jpc8l\") pod \"barbican-operator-controller-manager-868647ff47-hqmvw\" (UID: \"d53d2bae-fcdd-408c-9950-440e841cc035\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-hqmvw" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.244278 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvsp8\" (UniqueName: \"kubernetes.io/projected/43531003-74d3-43b9-b0f5-6fca42b21975-kube-api-access-tvsp8\") pod \"glance-operator-controller-manager-77987464f4-vgxsl\" (UID: \"43531003-74d3-43b9-b0f5-6fca42b21975\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-vgxsl" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.244303 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm2n2\" (UniqueName: \"kubernetes.io/projected/2e7ca3f2-f73b-4bac-93bb-68b2518d956e-kube-api-access-lm2n2\") pod 
\"cinder-operator-controller-manager-5d946d989d-cx7xf\" (UID: \"2e7ca3f2-f73b-4bac-93bb-68b2518d956e\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-cx7xf" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.244344 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvd65\" (UniqueName: \"kubernetes.io/projected/e4172fa9-b04e-4894-82d6-ec65ea92b004-kube-api-access-fvd65\") pod \"manila-operator-controller-manager-54f6768c69-t2hfl\" (UID: \"e4172fa9-b04e-4894-82d6-ec65ea92b004\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.244383 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk7gk\" (UniqueName: \"kubernetes.io/projected/ecf3484a-026e-4655-bfa8-e5292e2f62c5-kube-api-access-pk7gk\") pod \"keystone-operator-controller-manager-b4d948c87-t7mwr\" (UID: \"ecf3484a-026e-4655-bfa8-e5292e2f62c5\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t7mwr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.244424 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9bl4\" (UniqueName: \"kubernetes.io/projected/f108f6ea-4506-48bf-b948-e367078c3dce-kube-api-access-c9bl4\") pod \"designate-operator-controller-manager-6d8bf5c495-t54x9\" (UID: \"f108f6ea-4506-48bf-b948-e367078c3dce\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-t54x9" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.244455 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg2nt\" (UniqueName: \"kubernetes.io/projected/8d4b6c84-e5ed-4761-b7c7-95b21da856f7-kube-api-access-fg2nt\") pod \"heat-operator-controller-manager-69f49c598c-vwx5n\" (UID: 
\"8d4b6c84-e5ed-4761-b7c7-95b21da856f7\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vwx5n" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.244495 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert\") pod \"infra-operator-controller-manager-79d975b745-4t46s\" (UID: \"3ff0155f-08fd-42f5-9b31-c3b9a7cefefe\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.244514 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p4px\" (UniqueName: \"kubernetes.io/projected/2b1c8872-b310-4994-819c-a8e472d8e522-kube-api-access-9p4px\") pod \"horizon-operator-controller-manager-5b9b8895d5-r9b5b\" (UID: \"2b1c8872-b310-4994-819c-a8e472d8e522\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-r9b5b" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.248392 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.249395 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.251753 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-xtskv" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.289715 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm2n2\" (UniqueName: \"kubernetes.io/projected/2e7ca3f2-f73b-4bac-93bb-68b2518d956e-kube-api-access-lm2n2\") pod \"cinder-operator-controller-manager-5d946d989d-cx7xf\" (UID: \"2e7ca3f2-f73b-4bac-93bb-68b2518d956e\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-cx7xf" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.292232 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9bl4\" (UniqueName: \"kubernetes.io/projected/f108f6ea-4506-48bf-b948-e367078c3dce-kube-api-access-c9bl4\") pod \"designate-operator-controller-manager-6d8bf5c495-t54x9\" (UID: \"f108f6ea-4506-48bf-b948-e367078c3dce\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-t54x9" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.292731 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpc8l\" (UniqueName: \"kubernetes.io/projected/d53d2bae-fcdd-408c-9950-440e841cc035-kube-api-access-jpc8l\") pod \"barbican-operator-controller-manager-868647ff47-hqmvw\" (UID: \"d53d2bae-fcdd-408c-9950-440e841cc035\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-hqmvw" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.293510 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.296619 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tvsp8\" (UniqueName: \"kubernetes.io/projected/43531003-74d3-43b9-b0f5-6fca42b21975-kube-api-access-tvsp8\") pod \"glance-operator-controller-manager-77987464f4-vgxsl\" (UID: \"43531003-74d3-43b9-b0f5-6fca42b21975\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-vgxsl" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.298335 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.304165 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.308199 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-t9kgf"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.309050 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-t9kgf" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.320576 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ft994" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.321745 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-t9kgf"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.346335 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvd65\" (UniqueName: \"kubernetes.io/projected/e4172fa9-b04e-4894-82d6-ec65ea92b004-kube-api-access-fvd65\") pod \"manila-operator-controller-manager-54f6768c69-t2hfl\" (UID: \"e4172fa9-b04e-4894-82d6-ec65ea92b004\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.346384 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5r9q\" (UniqueName: \"kubernetes.io/projected/588098b3-662f-4f6f-914c-8cb28e055ccd-kube-api-access-j5r9q\") pod \"mariadb-operator-controller-manager-6994f66f48-8v7q6\" (UID: \"588098b3-662f-4f6f-914c-8cb28e055ccd\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.346435 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk7gk\" (UniqueName: \"kubernetes.io/projected/ecf3484a-026e-4655-bfa8-e5292e2f62c5-kube-api-access-pk7gk\") pod \"keystone-operator-controller-manager-b4d948c87-t7mwr\" (UID: \"ecf3484a-026e-4655-bfa8-e5292e2f62c5\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t7mwr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.346454 4873 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7c9s\" (UniqueName: \"kubernetes.io/projected/aeccf47e-b953-4036-b271-be284b9ab385-kube-api-access-b7c9s\") pod \"ironic-operator-controller-manager-554564d7fc-f86jr\" (UID: \"aeccf47e-b953-4036-b271-be284b9ab385\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-f86jr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.346476 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg2nt\" (UniqueName: \"kubernetes.io/projected/8d4b6c84-e5ed-4761-b7c7-95b21da856f7-kube-api-access-fg2nt\") pod \"heat-operator-controller-manager-69f49c598c-vwx5n\" (UID: \"8d4b6c84-e5ed-4761-b7c7-95b21da856f7\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vwx5n" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.346496 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d294k\" (UniqueName: \"kubernetes.io/projected/8eec8859-f388-4d81-bbce-0433a66a1ef7-kube-api-access-d294k\") pod \"nova-operator-controller-manager-567668f5cf-n6djt\" (UID: \"8eec8859-f388-4d81-bbce-0433a66a1ef7\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.346515 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert\") pod \"infra-operator-controller-manager-79d975b745-4t46s\" (UID: \"3ff0155f-08fd-42f5-9b31-c3b9a7cefefe\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.346533 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p4px\" (UniqueName: 
\"kubernetes.io/projected/2b1c8872-b310-4994-819c-a8e472d8e522-kube-api-access-9p4px\") pod \"horizon-operator-controller-manager-5b9b8895d5-r9b5b\" (UID: \"2b1c8872-b310-4994-819c-a8e472d8e522\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-r9b5b" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.346558 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hnkj\" (UniqueName: \"kubernetes.io/projected/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-kube-api-access-8hnkj\") pod \"infra-operator-controller-manager-79d975b745-4t46s\" (UID: \"3ff0155f-08fd-42f5-9b31-c3b9a7cefefe\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.346573 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2flmp\" (UniqueName: \"kubernetes.io/projected/c471d099-fa02-4463-9eb9-9d0f6a3832e6-kube-api-access-2flmp\") pod \"neutron-operator-controller-manager-64ddbf8bb-d6h72\" (UID: \"c471d099-fa02-4463-9eb9-9d0f6a3832e6\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72" Feb 19 09:59:07 crc kubenswrapper[4873]: E0219 09:59:07.346859 4873 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 09:59:07 crc kubenswrapper[4873]: E0219 09:59:07.346906 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert podName:3ff0155f-08fd-42f5-9b31-c3b9a7cefefe nodeName:}" failed. No retries permitted until 2026-02-19 09:59:07.846889699 +0000 UTC m=+857.136321337 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert") pod "infra-operator-controller-manager-79d975b745-4t46s" (UID: "3ff0155f-08fd-42f5-9b31-c3b9a7cefefe") : secret "infra-operator-webhook-server-cert" not found Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.348309 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.349031 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.351467 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-zvctd" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.353974 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.354844 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.365093 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.365133 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bjf7v" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.368869 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p4px\" (UniqueName: \"kubernetes.io/projected/2b1c8872-b310-4994-819c-a8e472d8e522-kube-api-access-9p4px\") pod \"horizon-operator-controller-manager-5b9b8895d5-r9b5b\" (UID: \"2b1c8872-b310-4994-819c-a8e472d8e522\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-r9b5b" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.371957 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk7gk\" (UniqueName: \"kubernetes.io/projected/ecf3484a-026e-4655-bfa8-e5292e2f62c5-kube-api-access-pk7gk\") pod \"keystone-operator-controller-manager-b4d948c87-t7mwr\" (UID: \"ecf3484a-026e-4655-bfa8-e5292e2f62c5\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t7mwr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.374077 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvd65\" (UniqueName: \"kubernetes.io/projected/e4172fa9-b04e-4894-82d6-ec65ea92b004-kube-api-access-fvd65\") pod \"manila-operator-controller-manager-54f6768c69-t2hfl\" (UID: \"e4172fa9-b04e-4894-82d6-ec65ea92b004\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.384581 4873 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-t54x9" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.393530 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-vgxsl" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.399666 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hnkj\" (UniqueName: \"kubernetes.io/projected/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-kube-api-access-8hnkj\") pod \"infra-operator-controller-manager-79d975b745-4t46s\" (UID: \"3ff0155f-08fd-42f5-9b31-c3b9a7cefefe\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.401469 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg2nt\" (UniqueName: \"kubernetes.io/projected/8d4b6c84-e5ed-4761-b7c7-95b21da856f7-kube-api-access-fg2nt\") pod \"heat-operator-controller-manager-69f49c598c-vwx5n\" (UID: \"8d4b6c84-e5ed-4761-b7c7-95b21da856f7\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vwx5n" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.413321 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vwx5n" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.413495 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.421273 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.423303 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.425319 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-r9b5b" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.425803 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-lk7k4" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.441407 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.447356 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n57sh\" (UniqueName: \"kubernetes.io/projected/dc53742c-7e71-49fa-9378-b26036c80275-kube-api-access-n57sh\") pod \"ovn-operator-controller-manager-d44cf6b75-db4dr\" (UID: \"dc53742c-7e71-49fa-9378-b26036c80275\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.447395 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2flmp\" (UniqueName: \"kubernetes.io/projected/c471d099-fa02-4463-9eb9-9d0f6a3832e6-kube-api-access-2flmp\") pod \"neutron-operator-controller-manager-64ddbf8bb-d6h72\" (UID: \"c471d099-fa02-4463-9eb9-9d0f6a3832e6\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.447459 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hv8j\" (UniqueName: \"kubernetes.io/projected/080befba-c501-4f84-8644-6b9fda0d8d5f-kube-api-access-9hv8j\") pod 
\"octavia-operator-controller-manager-69f8888797-t9kgf\" (UID: \"080befba-c501-4f84-8644-6b9fda0d8d5f\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-t9kgf" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.447486 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5r9q\" (UniqueName: \"kubernetes.io/projected/588098b3-662f-4f6f-914c-8cb28e055ccd-kube-api-access-j5r9q\") pod \"mariadb-operator-controller-manager-6994f66f48-8v7q6\" (UID: \"588098b3-662f-4f6f-914c-8cb28e055ccd\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.447505 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv\" (UID: \"515c6c0c-ae00-4ae1-ab3f-e22e5a585681\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.447526 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vl7m\" (UniqueName: \"kubernetes.io/projected/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-kube-api-access-9vl7m\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv\" (UID: \"515c6c0c-ae00-4ae1-ab3f-e22e5a585681\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.447552 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7c9s\" (UniqueName: \"kubernetes.io/projected/aeccf47e-b953-4036-b271-be284b9ab385-kube-api-access-b7c9s\") pod \"ironic-operator-controller-manager-554564d7fc-f86jr\" (UID: \"aeccf47e-b953-4036-b271-be284b9ab385\") " 
pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-f86jr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.447579 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d294k\" (UniqueName: \"kubernetes.io/projected/8eec8859-f388-4d81-bbce-0433a66a1ef7-kube-api-access-d294k\") pod \"nova-operator-controller-manager-567668f5cf-n6djt\" (UID: \"8eec8859-f388-4d81-bbce-0433a66a1ef7\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.451040 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-r74rt"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.456238 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-r74rt" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.459388 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-2zcfc" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.463857 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d294k\" (UniqueName: \"kubernetes.io/projected/8eec8859-f388-4d81-bbce-0433a66a1ef7-kube-api-access-d294k\") pod \"nova-operator-controller-manager-567668f5cf-n6djt\" (UID: \"8eec8859-f388-4d81-bbce-0433a66a1ef7\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.465811 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5r9q\" (UniqueName: \"kubernetes.io/projected/588098b3-662f-4f6f-914c-8cb28e055ccd-kube-api-access-j5r9q\") pod \"mariadb-operator-controller-manager-6994f66f48-8v7q6\" (UID: \"588098b3-662f-4f6f-914c-8cb28e055ccd\") " 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.467956 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7c9s\" (UniqueName: \"kubernetes.io/projected/aeccf47e-b953-4036-b271-be284b9ab385-kube-api-access-b7c9s\") pod \"ironic-operator-controller-manager-554564d7fc-f86jr\" (UID: \"aeccf47e-b953-4036-b271-be284b9ab385\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-f86jr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.482015 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.483771 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2flmp\" (UniqueName: \"kubernetes.io/projected/c471d099-fa02-4463-9eb9-9d0f6a3832e6-kube-api-access-2flmp\") pod \"neutron-operator-controller-manager-64ddbf8bb-d6h72\" (UID: \"c471d099-fa02-4463-9eb9-9d0f6a3832e6\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.495960 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-r74rt"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.511746 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t7mwr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.518916 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.542371 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.543285 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.544686 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-f86jr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.547510 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-dh64x" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.548255 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrwkq\" (UniqueName: \"kubernetes.io/projected/74e9952e-50ef-4389-aa77-8f6e9cc790a8-kube-api-access-jrwkq\") pod \"placement-operator-controller-manager-8497b45c89-6hpwv\" (UID: \"74e9952e-50ef-4389-aa77-8f6e9cc790a8\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.548288 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv\" (UID: \"515c6c0c-ae00-4ae1-ab3f-e22e5a585681\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.548311 4873 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9vl7m\" (UniqueName: \"kubernetes.io/projected/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-kube-api-access-9vl7m\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv\" (UID: \"515c6c0c-ae00-4ae1-ab3f-e22e5a585681\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.548373 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n57sh\" (UniqueName: \"kubernetes.io/projected/dc53742c-7e71-49fa-9378-b26036c80275-kube-api-access-n57sh\") pod \"ovn-operator-controller-manager-d44cf6b75-db4dr\" (UID: \"dc53742c-7e71-49fa-9378-b26036c80275\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.548420 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76flf\" (UniqueName: \"kubernetes.io/projected/1f098ace-bbc4-46ee-8e72-ab65a59851eb-kube-api-access-76flf\") pod \"swift-operator-controller-manager-68f46476f-r74rt\" (UID: \"1f098ace-bbc4-46ee-8e72-ab65a59851eb\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-r74rt" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.548444 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hv8j\" (UniqueName: \"kubernetes.io/projected/080befba-c501-4f84-8644-6b9fda0d8d5f-kube-api-access-9hv8j\") pod \"octavia-operator-controller-manager-69f8888797-t9kgf\" (UID: \"080befba-c501-4f84-8644-6b9fda0d8d5f\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-t9kgf" Feb 19 09:59:07 crc kubenswrapper[4873]: E0219 09:59:07.548459 4873 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 
09:59:07 crc kubenswrapper[4873]: E0219 09:59:07.548508 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert podName:515c6c0c-ae00-4ae1-ab3f-e22e5a585681 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:08.048493197 +0000 UTC m=+857.337924835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" (UID: "515c6c0c-ae00-4ae1-ab3f-e22e5a585681") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.560754 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.579272 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hv8j\" (UniqueName: \"kubernetes.io/projected/080befba-c501-4f84-8644-6b9fda0d8d5f-kube-api-access-9hv8j\") pod \"octavia-operator-controller-manager-69f8888797-t9kgf\" (UID: \"080befba-c501-4f84-8644-6b9fda0d8d5f\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-t9kgf" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.586987 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.587275 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-hqmvw" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.587876 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-cx7xf" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.588644 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vl7m\" (UniqueName: \"kubernetes.io/projected/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-kube-api-access-9vl7m\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv\" (UID: \"515c6c0c-ae00-4ae1-ab3f-e22e5a585681\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.592960 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n57sh\" (UniqueName: \"kubernetes.io/projected/dc53742c-7e71-49fa-9378-b26036c80275-kube-api-access-n57sh\") pod \"ovn-operator-controller-manager-d44cf6b75-db4dr\" (UID: \"dc53742c-7e71-49fa-9378-b26036c80275\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.634937 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-2szzj"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.636765 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-2szzj" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.640365 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-2szzj"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.640813 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.644329 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-cgphr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.649299 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76flf\" (UniqueName: \"kubernetes.io/projected/1f098ace-bbc4-46ee-8e72-ab65a59851eb-kube-api-access-76flf\") pod \"swift-operator-controller-manager-68f46476f-r74rt\" (UID: \"1f098ace-bbc4-46ee-8e72-ab65a59851eb\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-r74rt" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.649342 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrwkq\" (UniqueName: \"kubernetes.io/projected/74e9952e-50ef-4389-aa77-8f6e9cc790a8-kube-api-access-jrwkq\") pod \"placement-operator-controller-manager-8497b45c89-6hpwv\" (UID: \"74e9952e-50ef-4389-aa77-8f6e9cc790a8\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.649388 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf4xn\" (UniqueName: \"kubernetes.io/projected/0e9da99c-56ee-4353-9378-c59a2c4e1608-kube-api-access-vf4xn\") pod \"telemetry-operator-controller-manager-7f45b4ff68-g22tc\" (UID: \"0e9da99c-56ee-4353-9378-c59a2c4e1608\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.656261 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.656662 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.659095 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.665434 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.677394 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-6vgpb" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.695888 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76flf\" (UniqueName: \"kubernetes.io/projected/1f098ace-bbc4-46ee-8e72-ab65a59851eb-kube-api-access-76flf\") pod \"swift-operator-controller-manager-68f46476f-r74rt\" (UID: \"1f098ace-bbc4-46ee-8e72-ab65a59851eb\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-r74rt" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.696553 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrwkq\" (UniqueName: \"kubernetes.io/projected/74e9952e-50ef-4389-aa77-8f6e9cc790a8-kube-api-access-jrwkq\") pod \"placement-operator-controller-manager-8497b45c89-6hpwv\" (UID: \"74e9952e-50ef-4389-aa77-8f6e9cc790a8\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.740917 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-t9kgf" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.751971 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf4xn\" (UniqueName: \"kubernetes.io/projected/0e9da99c-56ee-4353-9378-c59a2c4e1608-kube-api-access-vf4xn\") pod \"telemetry-operator-controller-manager-7f45b4ff68-g22tc\" (UID: \"0e9da99c-56ee-4353-9378-c59a2c4e1608\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.752054 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxkq2\" (UniqueName: \"kubernetes.io/projected/e139553a-a68d-424d-95b5-9093ea05440b-kube-api-access-sxkq2\") pod \"test-operator-controller-manager-7866795846-2szzj\" (UID: \"e139553a-a68d-424d-95b5-9093ea05440b\") " pod="openstack-operators/test-operator-controller-manager-7866795846-2szzj" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.752079 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj4pm\" (UniqueName: \"kubernetes.io/projected/e827e28d-ffd8-4f59-82bf-a6db1dab5413-kube-api-access-lj4pm\") pod \"watcher-operator-controller-manager-7d767c64df-hld6w\" (UID: \"e827e28d-ffd8-4f59-82bf-a6db1dab5413\") " pod="openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.753496 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.774219 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.775125 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.777985 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.779372 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf4xn\" (UniqueName: \"kubernetes.io/projected/0e9da99c-56ee-4353-9378-c59a2c4e1608-kube-api-access-vf4xn\") pod \"telemetry-operator-controller-manager-7f45b4ff68-g22tc\" (UID: \"0e9da99c-56ee-4353-9378-c59a2c4e1608\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.782232 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.782293 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-sp6bz" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.782427 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.792657 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.793056 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-r74rt" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.832657 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lcnz4"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.833470 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lcnz4" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.841794 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-m78br" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.851475 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lcnz4"] Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.856546 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert\") pod \"infra-operator-controller-manager-79d975b745-4t46s\" (UID: \"3ff0155f-08fd-42f5-9b31-c3b9a7cefefe\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.856587 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " 
pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.856630 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.856654 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxkq2\" (UniqueName: \"kubernetes.io/projected/e139553a-a68d-424d-95b5-9093ea05440b-kube-api-access-sxkq2\") pod \"test-operator-controller-manager-7866795846-2szzj\" (UID: \"e139553a-a68d-424d-95b5-9093ea05440b\") " pod="openstack-operators/test-operator-controller-manager-7866795846-2szzj" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.856680 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj4pm\" (UniqueName: \"kubernetes.io/projected/e827e28d-ffd8-4f59-82bf-a6db1dab5413-kube-api-access-lj4pm\") pod \"watcher-operator-controller-manager-7d767c64df-hld6w\" (UID: \"e827e28d-ffd8-4f59-82bf-a6db1dab5413\") " pod="openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.856830 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv998\" (UniqueName: \"kubernetes.io/projected/26f0a6ea-18fb-411a-b193-83938a4bbe19-kube-api-access-kv998\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:07 crc kubenswrapper[4873]: E0219 
09:59:07.857399 4873 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 09:59:07 crc kubenswrapper[4873]: E0219 09:59:07.857463 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert podName:3ff0155f-08fd-42f5-9b31-c3b9a7cefefe nodeName:}" failed. No retries permitted until 2026-02-19 09:59:08.857445931 +0000 UTC m=+858.146877569 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert") pod "infra-operator-controller-manager-79d975b745-4t46s" (UID: "3ff0155f-08fd-42f5-9b31-c3b9a7cefefe") : secret "infra-operator-webhook-server-cert" not found Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.888666 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj4pm\" (UniqueName: \"kubernetes.io/projected/e827e28d-ffd8-4f59-82bf-a6db1dab5413-kube-api-access-lj4pm\") pod \"watcher-operator-controller-manager-7d767c64df-hld6w\" (UID: \"e827e28d-ffd8-4f59-82bf-a6db1dab5413\") " pod="openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.891419 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxkq2\" (UniqueName: \"kubernetes.io/projected/e139553a-a68d-424d-95b5-9093ea05440b-kube-api-access-sxkq2\") pod \"test-operator-controller-manager-7866795846-2szzj\" (UID: \"e139553a-a68d-424d-95b5-9093ea05440b\") " pod="openstack-operators/test-operator-controller-manager-7866795846-2szzj" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.906924 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.958859 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.959652 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj849\" (UniqueName: \"kubernetes.io/projected/9574bff7-0aac-4a24-b69f-135ff968422e-kube-api-access-pj849\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lcnz4\" (UID: \"9574bff7-0aac-4a24-b69f-135ff968422e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lcnz4" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.959732 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv998\" (UniqueName: \"kubernetes.io/projected/26f0a6ea-18fb-411a-b193-83938a4bbe19-kube-api-access-kv998\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.959842 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:07 crc kubenswrapper[4873]: E0219 
09:59:07.959995 4873 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 09:59:07 crc kubenswrapper[4873]: E0219 09:59:07.960044 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs podName:26f0a6ea-18fb-411a-b193-83938a4bbe19 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:08.460029239 +0000 UTC m=+857.749460877 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs") pod "openstack-operator-controller-manager-77c7c45f98-q8khx" (UID: "26f0a6ea-18fb-411a-b193-83938a4bbe19") : secret "webhook-server-cert" not found Feb 19 09:59:07 crc kubenswrapper[4873]: E0219 09:59:07.960326 4873 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 09:59:07 crc kubenswrapper[4873]: E0219 09:59:07.960447 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs podName:26f0a6ea-18fb-411a-b193-83938a4bbe19 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:08.460412649 +0000 UTC m=+857.749844277 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs") pod "openstack-operator-controller-manager-77c7c45f98-q8khx" (UID: "26f0a6ea-18fb-411a-b193-83938a4bbe19") : secret "metrics-server-cert" not found Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.983250 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv998\" (UniqueName: \"kubernetes.io/projected/26f0a6ea-18fb-411a-b193-83938a4bbe19-kube-api-access-kv998\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:07 crc kubenswrapper[4873]: I0219 09:59:07.989064 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-2szzj" Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.042553 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w" Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.061384 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv\" (UID: \"515c6c0c-ae00-4ae1-ab3f-e22e5a585681\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.061507 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj849\" (UniqueName: \"kubernetes.io/projected/9574bff7-0aac-4a24-b69f-135ff968422e-kube-api-access-pj849\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lcnz4\" (UID: \"9574bff7-0aac-4a24-b69f-135ff968422e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lcnz4" Feb 19 09:59:08 crc kubenswrapper[4873]: E0219 09:59:08.061861 4873 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 09:59:08 crc kubenswrapper[4873]: E0219 09:59:08.061901 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert podName:515c6c0c-ae00-4ae1-ab3f-e22e5a585681 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:09.061889789 +0000 UTC m=+858.351321427 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" (UID: "515c6c0c-ae00-4ae1-ab3f-e22e5a585681") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.089874 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj849\" (UniqueName: \"kubernetes.io/projected/9574bff7-0aac-4a24-b69f-135ff968422e-kube-api-access-pj849\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lcnz4\" (UID: \"9574bff7-0aac-4a24-b69f-135ff968422e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lcnz4" Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.091926 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-t54x9"] Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.204382 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lcnz4" Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.240823 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-vwx5n"] Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.433187 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vwx5n" event={"ID":"8d4b6c84-e5ed-4761-b7c7-95b21da856f7","Type":"ContainerStarted","Data":"24d02d511b5ec009a77212c2771e31882b68069a6d07f2fa771ed35159b5004c"} Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.436812 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-t54x9" event={"ID":"f108f6ea-4506-48bf-b948-e367078c3dce","Type":"ContainerStarted","Data":"414b9f2975057bcfe166e6127938213546b3b6734d32b11fb953b8635da0d788"} Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.469614 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.469673 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:08 crc kubenswrapper[4873]: E0219 09:59:08.469791 4873 secret.go:188] Couldn't get secret 
openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 09:59:08 crc kubenswrapper[4873]: E0219 09:59:08.469835 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs podName:26f0a6ea-18fb-411a-b193-83938a4bbe19 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:09.469822612 +0000 UTC m=+858.759254250 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs") pod "openstack-operator-controller-manager-77c7c45f98-q8khx" (UID: "26f0a6ea-18fb-411a-b193-83938a4bbe19") : secret "metrics-server-cert" not found Feb 19 09:59:08 crc kubenswrapper[4873]: E0219 09:59:08.470266 4873 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 09:59:08 crc kubenswrapper[4873]: E0219 09:59:08.470346 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs podName:26f0a6ea-18fb-411a-b193-83938a4bbe19 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:09.470327595 +0000 UTC m=+858.759759233 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs") pod "openstack-operator-controller-manager-77c7c45f98-q8khx" (UID: "26f0a6ea-18fb-411a-b193-83938a4bbe19") : secret "webhook-server-cert" not found Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.571971 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-vgxsl"] Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.648337 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-r9b5b"] Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.663864 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-f86jr"] Feb 19 09:59:08 crc kubenswrapper[4873]: W0219 09:59:08.688989 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaeccf47e_b953_4036_b271_be284b9ab385.slice/crio-7cd316caba7af708ae47bcb905e8acc60aaaadb46d49e04c7c96b5d80f4a8a26 WatchSource:0}: Error finding container 7cd316caba7af708ae47bcb905e8acc60aaaadb46d49e04c7c96b5d80f4a8a26: Status 404 returned error can't find the container with id 7cd316caba7af708ae47bcb905e8acc60aaaadb46d49e04c7c96b5d80f4a8a26 Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.707541 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-t7mwr"] Feb 19 09:59:08 crc kubenswrapper[4873]: W0219 09:59:08.715271 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecf3484a_026e_4655_bfa8_e5292e2f62c5.slice/crio-a9042d0e19776e93d7021eddd1ada4c68f1260fe782317c12c84ba2df0681236 WatchSource:0}: Error finding container 
a9042d0e19776e93d7021eddd1ada4c68f1260fe782317c12c84ba2df0681236: Status 404 returned error can't find the container with id a9042d0e19776e93d7021eddd1ada4c68f1260fe782317c12c84ba2df0681236 Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.749034 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-hqmvw"] Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.763423 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl"] Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.766694 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-cx7xf"] Feb 19 09:59:08 crc kubenswrapper[4873]: W0219 09:59:08.770938 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd53d2bae_fcdd_408c_9950_440e841cc035.slice/crio-506a1bc0339c1f11cfcc484268f29e2ca84f37dbd40b9c4b435882b2f63456f6 WatchSource:0}: Error finding container 506a1bc0339c1f11cfcc484268f29e2ca84f37dbd40b9c4b435882b2f63456f6: Status 404 returned error can't find the container with id 506a1bc0339c1f11cfcc484268f29e2ca84f37dbd40b9c4b435882b2f63456f6 Feb 19 09:59:08 crc kubenswrapper[4873]: W0219 09:59:08.775424 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e7ca3f2_f73b_4bac_93bb_68b2518d956e.slice/crio-adbe6fa4b0fde7878f9afdc95058300771b23557d9a2370d2af3469297f38cfe WatchSource:0}: Error finding container adbe6fa4b0fde7878f9afdc95058300771b23557d9a2370d2af3469297f38cfe: Status 404 returned error can't find the container with id adbe6fa4b0fde7878f9afdc95058300771b23557d9a2370d2af3469297f38cfe Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.780966 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt"] Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.786306 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72"] Feb 19 09:59:08 crc kubenswrapper[4873]: W0219 09:59:08.796316 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc471d099_fa02_4463_9eb9_9d0f6a3832e6.slice/crio-0a643a8cb41354501e730acc4fc6090e1c1b0a0711553dd0927d52aa65fe54e1 WatchSource:0}: Error finding container 0a643a8cb41354501e730acc4fc6090e1c1b0a0711553dd0927d52aa65fe54e1: Status 404 returned error can't find the container with id 0a643a8cb41354501e730acc4fc6090e1c1b0a0711553dd0927d52aa65fe54e1 Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.879971 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert\") pod \"infra-operator-controller-manager-79d975b745-4t46s\" (UID: \"3ff0155f-08fd-42f5-9b31-c3b9a7cefefe\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:08 crc kubenswrapper[4873]: E0219 09:59:08.880430 4873 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 09:59:08 crc kubenswrapper[4873]: E0219 09:59:08.880489 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert podName:3ff0155f-08fd-42f5-9b31-c3b9a7cefefe nodeName:}" failed. No retries permitted until 2026-02-19 09:59:10.880472292 +0000 UTC m=+860.169903920 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert") pod "infra-operator-controller-manager-79d975b745-4t46s" (UID: "3ff0155f-08fd-42f5-9b31-c3b9a7cefefe") : secret "infra-operator-webhook-server-cert" not found Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.886472 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-r74rt"] Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.898259 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-t9kgf"] Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.915631 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6"] Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.926934 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv"] Feb 19 09:59:08 crc kubenswrapper[4873]: W0219 09:59:08.943215 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod588098b3_662f_4f6f_914c_8cb28e055ccd.slice/crio-f61878bf71a0ed236d0b8f082a5c8aee8e815d36fe794f05e4bd1c520b54c91f WatchSource:0}: Error finding container f61878bf71a0ed236d0b8f082a5c8aee8e815d36fe794f05e4bd1c520b54c91f: Status 404 returned error can't find the container with id f61878bf71a0ed236d0b8f082a5c8aee8e815d36fe794f05e4bd1c520b54c91f Feb 19 09:59:08 crc kubenswrapper[4873]: W0219 09:59:08.943933 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74e9952e_50ef_4389_aa77_8f6e9cc790a8.slice/crio-ea545351537f5f31065a29341387acce181aab73c2f392144210e08eaf2b03a1 WatchSource:0}: Error finding container 
ea545351537f5f31065a29341387acce181aab73c2f392144210e08eaf2b03a1: Status 404 returned error can't find the container with id ea545351537f5f31065a29341387acce181aab73c2f392144210e08eaf2b03a1 Feb 19 09:59:08 crc kubenswrapper[4873]: E0219 09:59:08.947505 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jrwkq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-6hpwv_openstack-operators(74e9952e-50ef-4389-aa77-8f6e9cc790a8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 09:59:08 crc kubenswrapper[4873]: E0219 09:59:08.947748 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j5r9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-8v7q6_openstack-operators(588098b3-662f-4f6f-914c-8cb28e055ccd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 09:59:08 crc kubenswrapper[4873]: E0219 09:59:08.948878 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6" podUID="588098b3-662f-4f6f-914c-8cb28e055ccd" Feb 19 09:59:08 crc 
kubenswrapper[4873]: E0219 09:59:08.948914 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv" podUID="74e9952e-50ef-4389-aa77-8f6e9cc790a8" Feb 19 09:59:08 crc kubenswrapper[4873]: I0219 09:59:08.955254 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr"] Feb 19 09:59:08 crc kubenswrapper[4873]: W0219 09:59:08.974320 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc53742c_7e71_49fa_9378_b26036c80275.slice/crio-07fc4356403790343018fd2b4f7c70f856ed6770b013fb903beb9e6b5b75ca9a WatchSource:0}: Error finding container 07fc4356403790343018fd2b4f7c70f856ed6770b013fb903beb9e6b5b75ca9a: Status 404 returned error can't find the container with id 07fc4356403790343018fd2b4f7c70f856ed6770b013fb903beb9e6b5b75ca9a Feb 19 09:59:08 crc kubenswrapper[4873]: E0219 09:59:08.983251 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n57sh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-db4dr_openstack-operators(dc53742c-7e71-49fa-9378-b26036c80275): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 09:59:08 crc kubenswrapper[4873]: E0219 09:59:08.984481 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr" podUID="dc53742c-7e71-49fa-9378-b26036c80275" Feb 19 09:59:09 crc 
kubenswrapper[4873]: I0219 09:59:09.011077 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lcnz4"] Feb 19 09:59:09 crc kubenswrapper[4873]: W0219 09:59:09.012060 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9574bff7_0aac_4a24_b69f_135ff968422e.slice/crio-c9382da34f60f936ac014ac6367e76bc99a2b1b12eda1e174441bac949ce650d WatchSource:0}: Error finding container c9382da34f60f936ac014ac6367e76bc99a2b1b12eda1e174441bac949ce650d: Status 404 returned error can't find the container with id c9382da34f60f936ac014ac6367e76bc99a2b1b12eda1e174441bac949ce650d Feb 19 09:59:09 crc kubenswrapper[4873]: W0219 09:59:09.013197 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode827e28d_ffd8_4f59_82bf_a6db1dab5413.slice/crio-e954b7b1e5181f3f4ed58f9b10384e274c353f3c859b0d98cd0610a69a1b7f0b WatchSource:0}: Error finding container e954b7b1e5181f3f4ed58f9b10384e274c353f3c859b0d98cd0610a69a1b7f0b: Status 404 returned error can't find the container with id e954b7b1e5181f3f4ed58f9b10384e274c353f3c859b0d98cd0610a69a1b7f0b Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.018509 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w"] Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.021812 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.20:5001/openstack-k8s-operators/watcher-operator:539263f45944cd14f527defc4d55afa08e448c3e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lj4pm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7d767c64df-hld6w_openstack-operators(e827e28d-ffd8-4f59-82bf-a6db1dab5413): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.023479 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w" podUID="e827e28d-ffd8-4f59-82bf-a6db1dab5413" Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.024046 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vf4xn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-g22tc_openstack-operators(0e9da99c-56ee-4353-9378-c59a2c4e1608): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 09:59:09 crc kubenswrapper[4873]: W0219 09:59:09.024391 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode139553a_a68d_424d_95b5_9093ea05440b.slice/crio-05f1995f615c41aab02673ce54add434398a6ece6a6b752715b8d014b30d3fa6 WatchSource:0}: Error finding container 05f1995f615c41aab02673ce54add434398a6ece6a6b752715b8d014b30d3fa6: Status 404 returned error can't find the container with id 05f1995f615c41aab02673ce54add434398a6ece6a6b752715b8d014b30d3fa6 Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.025291 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc" podUID="0e9da99c-56ee-4353-9378-c59a2c4e1608" Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.026292 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sxkq2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-2szzj_openstack-operators(e139553a-a68d-424d-95b5-9093ea05440b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.027440 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-2szzj" podUID="e139553a-a68d-424d-95b5-9093ea05440b" Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.029060 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc"] Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.038806 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-2szzj"] Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.083011 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert\") pod 
\"openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv\" (UID: \"515c6c0c-ae00-4ae1-ab3f-e22e5a585681\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.083205 4873 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.083254 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert podName:515c6c0c-ae00-4ae1-ab3f-e22e5a585681 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:11.083241599 +0000 UTC m=+860.372673237 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" (UID: "515c6c0c-ae00-4ae1-ab3f-e22e5a585681") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.449006 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lcnz4" event={"ID":"9574bff7-0aac-4a24-b69f-135ff968422e","Type":"ContainerStarted","Data":"c9382da34f60f936ac014ac6367e76bc99a2b1b12eda1e174441bac949ce650d"} Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.450375 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-hqmvw" event={"ID":"d53d2bae-fcdd-408c-9950-440e841cc035","Type":"ContainerStarted","Data":"506a1bc0339c1f11cfcc484268f29e2ca84f37dbd40b9c4b435882b2f63456f6"} Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.451499 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-r74rt" event={"ID":"1f098ace-bbc4-46ee-8e72-ab65a59851eb","Type":"ContainerStarted","Data":"18ae34ca37037d850fa7d437165bdff531fac4b2fe5a4193823b7c876903af12"} Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.454146 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-vgxsl" event={"ID":"43531003-74d3-43b9-b0f5-6fca42b21975","Type":"ContainerStarted","Data":"b7fe5f125f1c1ac76322ee1c155615ffd2c3d11b13774f9f36efff700fd0317d"} Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.455252 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc" event={"ID":"0e9da99c-56ee-4353-9378-c59a2c4e1608","Type":"ContainerStarted","Data":"26221d1fd1110d89ce519a4fa80ad3ebb157b158c2593caf2ced1ff87c5ffebc"} Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.456408 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-t9kgf" event={"ID":"080befba-c501-4f84-8644-6b9fda0d8d5f","Type":"ContainerStarted","Data":"706cd4e4db84d0745ce9525c4129ba96886cf7f55fc80c326270369c15ee3538"} Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.456576 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc" podUID="0e9da99c-56ee-4353-9378-c59a2c4e1608" Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.458947 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv" 
event={"ID":"74e9952e-50ef-4389-aa77-8f6e9cc790a8","Type":"ContainerStarted","Data":"ea545351537f5f31065a29341387acce181aab73c2f392144210e08eaf2b03a1"} Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.470218 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl" event={"ID":"e4172fa9-b04e-4894-82d6-ec65ea92b004","Type":"ContainerStarted","Data":"833fcffe5cb2f21bd4409b4b6090d4db0f28b1db27a59b14fe1c322a32736e5a"} Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.471894 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv" podUID="74e9952e-50ef-4389-aa77-8f6e9cc790a8" Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.473189 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt" event={"ID":"8eec8859-f388-4d81-bbce-0433a66a1ef7","Type":"ContainerStarted","Data":"c8dfd3e11ceed31f615c53da9ab506380c7f11be17a86f9cc0e630bcc87e53b1"} Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.477233 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6" event={"ID":"588098b3-662f-4f6f-914c-8cb28e055ccd","Type":"ContainerStarted","Data":"f61878bf71a0ed236d0b8f082a5c8aee8e815d36fe794f05e4bd1c520b54c91f"} Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.483035 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6" podUID="588098b3-662f-4f6f-914c-8cb28e055ccd" Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.488734 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.488847 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.488966 4873 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.489007 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs podName:26f0a6ea-18fb-411a-b193-83938a4bbe19 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:11.488995299 +0000 UTC m=+860.778426937 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs") pod "openstack-operator-controller-manager-77c7c45f98-q8khx" (UID: "26f0a6ea-18fb-411a-b193-83938a4bbe19") : secret "webhook-server-cert" not found Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.489045 4873 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.489063 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs podName:26f0a6ea-18fb-411a-b193-83938a4bbe19 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:11.489057571 +0000 UTC m=+860.778489209 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs") pod "openstack-operator-controller-manager-77c7c45f98-q8khx" (UID: "26f0a6ea-18fb-411a-b193-83938a4bbe19") : secret "metrics-server-cert" not found Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.504575 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.20:5001/openstack-k8s-operators/watcher-operator:539263f45944cd14f527defc4d55afa08e448c3e\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w" podUID="e827e28d-ffd8-4f59-82bf-a6db1dab5413" Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.522997 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" 
pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr" podUID="dc53742c-7e71-49fa-9378-b26036c80275" Feb 19 09:59:09 crc kubenswrapper[4873]: E0219 09:59:09.523095 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-2szzj" podUID="e139553a-a68d-424d-95b5-9093ea05440b" Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.546307 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-r9b5b" event={"ID":"2b1c8872-b310-4994-819c-a8e472d8e522","Type":"ContainerStarted","Data":"615ce599ded1c085ac70ccf13569cd284845bbdb3e7da9a5ff392abf7268a6d3"} Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.546360 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-f86jr" event={"ID":"aeccf47e-b953-4036-b271-be284b9ab385","Type":"ContainerStarted","Data":"7cd316caba7af708ae47bcb905e8acc60aaaadb46d49e04c7c96b5d80f4a8a26"} Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.546377 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-cx7xf" event={"ID":"2e7ca3f2-f73b-4bac-93bb-68b2518d956e","Type":"ContainerStarted","Data":"adbe6fa4b0fde7878f9afdc95058300771b23557d9a2370d2af3469297f38cfe"} Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.546416 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w" event={"ID":"e827e28d-ffd8-4f59-82bf-a6db1dab5413","Type":"ContainerStarted","Data":"e954b7b1e5181f3f4ed58f9b10384e274c353f3c859b0d98cd0610a69a1b7f0b"} Feb 19 09:59:09 
crc kubenswrapper[4873]: I0219 09:59:09.546431 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72" event={"ID":"c471d099-fa02-4463-9eb9-9d0f6a3832e6","Type":"ContainerStarted","Data":"0a643a8cb41354501e730acc4fc6090e1c1b0a0711553dd0927d52aa65fe54e1"} Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.546444 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t7mwr" event={"ID":"ecf3484a-026e-4655-bfa8-e5292e2f62c5","Type":"ContainerStarted","Data":"a9042d0e19776e93d7021eddd1ada4c68f1260fe782317c12c84ba2df0681236"} Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.546461 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-2szzj" event={"ID":"e139553a-a68d-424d-95b5-9093ea05440b","Type":"ContainerStarted","Data":"05f1995f615c41aab02673ce54add434398a6ece6a6b752715b8d014b30d3fa6"} Feb 19 09:59:09 crc kubenswrapper[4873]: I0219 09:59:09.546475 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr" event={"ID":"dc53742c-7e71-49fa-9378-b26036c80275","Type":"ContainerStarted","Data":"07fc4356403790343018fd2b4f7c70f856ed6770b013fb903beb9e6b5b75ca9a"} Feb 19 09:59:10 crc kubenswrapper[4873]: E0219 09:59:10.529154 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6" podUID="588098b3-662f-4f6f-914c-8cb28e055ccd" Feb 19 09:59:10 crc kubenswrapper[4873]: E0219 09:59:10.529365 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr" podUID="dc53742c-7e71-49fa-9378-b26036c80275" Feb 19 09:59:10 crc kubenswrapper[4873]: E0219 09:59:10.529732 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc" podUID="0e9da99c-56ee-4353-9378-c59a2c4e1608" Feb 19 09:59:10 crc kubenswrapper[4873]: E0219 09:59:10.529814 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.20:5001/openstack-k8s-operators/watcher-operator:539263f45944cd14f527defc4d55afa08e448c3e\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w" podUID="e827e28d-ffd8-4f59-82bf-a6db1dab5413" Feb 19 09:59:10 crc kubenswrapper[4873]: E0219 09:59:10.529822 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-2szzj" podUID="e139553a-a68d-424d-95b5-9093ea05440b" Feb 19 09:59:10 crc kubenswrapper[4873]: E0219 09:59:10.531176 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv" podUID="74e9952e-50ef-4389-aa77-8f6e9cc790a8" Feb 19 09:59:10 crc kubenswrapper[4873]: I0219 09:59:10.928561 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert\") pod \"infra-operator-controller-manager-79d975b745-4t46s\" (UID: \"3ff0155f-08fd-42f5-9b31-c3b9a7cefefe\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:10 crc kubenswrapper[4873]: E0219 09:59:10.928720 4873 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 09:59:10 crc kubenswrapper[4873]: E0219 09:59:10.929050 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert podName:3ff0155f-08fd-42f5-9b31-c3b9a7cefefe nodeName:}" failed. No retries permitted until 2026-02-19 09:59:14.929031916 +0000 UTC m=+864.218463554 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert") pod "infra-operator-controller-manager-79d975b745-4t46s" (UID: "3ff0155f-08fd-42f5-9b31-c3b9a7cefefe") : secret "infra-operator-webhook-server-cert" not found Feb 19 09:59:11 crc kubenswrapper[4873]: I0219 09:59:11.132037 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv\" (UID: \"515c6c0c-ae00-4ae1-ab3f-e22e5a585681\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:11 crc kubenswrapper[4873]: E0219 09:59:11.132297 4873 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 09:59:11 crc kubenswrapper[4873]: E0219 09:59:11.132506 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert podName:515c6c0c-ae00-4ae1-ab3f-e22e5a585681 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:15.132413998 +0000 UTC m=+864.421845636 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" (UID: "515c6c0c-ae00-4ae1-ab3f-e22e5a585681") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 09:59:11 crc kubenswrapper[4873]: E0219 09:59:11.526704 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-2szzj" podUID="e139553a-a68d-424d-95b5-9093ea05440b" Feb 19 09:59:11 crc kubenswrapper[4873]: I0219 09:59:11.536613 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:11 crc kubenswrapper[4873]: I0219 09:59:11.536671 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:11 crc kubenswrapper[4873]: E0219 09:59:11.536789 4873 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 09:59:11 crc kubenswrapper[4873]: E0219 09:59:11.536827 4873 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs podName:26f0a6ea-18fb-411a-b193-83938a4bbe19 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:15.536814455 +0000 UTC m=+864.826246093 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs") pod "openstack-operator-controller-manager-77c7c45f98-q8khx" (UID: "26f0a6ea-18fb-411a-b193-83938a4bbe19") : secret "metrics-server-cert" not found Feb 19 09:59:11 crc kubenswrapper[4873]: E0219 09:59:11.536873 4873 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 09:59:11 crc kubenswrapper[4873]: E0219 09:59:11.536901 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs podName:26f0a6ea-18fb-411a-b193-83938a4bbe19 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:15.536892276 +0000 UTC m=+864.826323914 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs") pod "openstack-operator-controller-manager-77c7c45f98-q8khx" (UID: "26f0a6ea-18fb-411a-b193-83938a4bbe19") : secret "webhook-server-cert" not found Feb 19 09:59:14 crc kubenswrapper[4873]: I0219 09:59:14.991464 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert\") pod \"infra-operator-controller-manager-79d975b745-4t46s\" (UID: \"3ff0155f-08fd-42f5-9b31-c3b9a7cefefe\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:14 crc kubenswrapper[4873]: E0219 09:59:14.991646 4873 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 09:59:14 crc kubenswrapper[4873]: E0219 09:59:14.991982 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert podName:3ff0155f-08fd-42f5-9b31-c3b9a7cefefe nodeName:}" failed. No retries permitted until 2026-02-19 09:59:22.991963901 +0000 UTC m=+872.281395539 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert") pod "infra-operator-controller-manager-79d975b745-4t46s" (UID: "3ff0155f-08fd-42f5-9b31-c3b9a7cefefe") : secret "infra-operator-webhook-server-cert" not found Feb 19 09:59:15 crc kubenswrapper[4873]: I0219 09:59:15.194907 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv\" (UID: \"515c6c0c-ae00-4ae1-ab3f-e22e5a585681\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:15 crc kubenswrapper[4873]: E0219 09:59:15.195078 4873 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 09:59:15 crc kubenswrapper[4873]: E0219 09:59:15.195393 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert podName:515c6c0c-ae00-4ae1-ab3f-e22e5a585681 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:23.195374644 +0000 UTC m=+872.484806282 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" (UID: "515c6c0c-ae00-4ae1-ab3f-e22e5a585681") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 09:59:15 crc kubenswrapper[4873]: I0219 09:59:15.606791 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:15 crc kubenswrapper[4873]: I0219 09:59:15.606915 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:15 crc kubenswrapper[4873]: E0219 09:59:15.606976 4873 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 09:59:15 crc kubenswrapper[4873]: E0219 09:59:15.607074 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs podName:26f0a6ea-18fb-411a-b193-83938a4bbe19 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:23.60705554 +0000 UTC m=+872.896487178 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs") pod "openstack-operator-controller-manager-77c7c45f98-q8khx" (UID: "26f0a6ea-18fb-411a-b193-83938a4bbe19") : secret "webhook-server-cert" not found Feb 19 09:59:15 crc kubenswrapper[4873]: E0219 09:59:15.607161 4873 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 09:59:15 crc kubenswrapper[4873]: E0219 09:59:15.607291 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs podName:26f0a6ea-18fb-411a-b193-83938a4bbe19 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:23.607268215 +0000 UTC m=+872.896699873 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs") pod "openstack-operator-controller-manager-77c7c45f98-q8khx" (UID: "26f0a6ea-18fb-411a-b193-83938a4bbe19") : secret "metrics-server-cert" not found Feb 19 09:59:20 crc kubenswrapper[4873]: E0219 09:59:20.636408 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 19 09:59:20 crc kubenswrapper[4873]: E0219 09:59:20.637156 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2flmp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-d6h72_openstack-operators(c471d099-fa02-4463-9eb9-9d0f6a3832e6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 09:59:20 crc kubenswrapper[4873]: E0219 09:59:20.638334 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72" podUID="c471d099-fa02-4463-9eb9-9d0f6a3832e6" Feb 19 09:59:21 crc kubenswrapper[4873]: E0219 09:59:21.361337 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c" Feb 19 09:59:21 crc kubenswrapper[4873]: E0219 09:59:21.361692 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fvd65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-54f6768c69-t2hfl_openstack-operators(e4172fa9-b04e-4894-82d6-ec65ea92b004): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 09:59:21 crc kubenswrapper[4873]: E0219 09:59:21.362888 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl" podUID="e4172fa9-b04e-4894-82d6-ec65ea92b004" Feb 19 09:59:21 crc kubenswrapper[4873]: E0219 09:59:21.594733 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl" podUID="e4172fa9-b04e-4894-82d6-ec65ea92b004" Feb 19 09:59:21 crc kubenswrapper[4873]: E0219 09:59:21.597085 4873 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72" podUID="c471d099-fa02-4463-9eb9-9d0f6a3832e6" Feb 19 09:59:22 crc kubenswrapper[4873]: I0219 09:59:22.486010 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 09:59:23 crc kubenswrapper[4873]: E0219 09:59:23.005743 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 19 09:59:23 crc kubenswrapper[4873]: E0219 09:59:23.005985 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d294k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-n6djt_openstack-operators(8eec8859-f388-4d81-bbce-0433a66a1ef7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 09:59:23 crc kubenswrapper[4873]: E0219 09:59:23.007403 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt" podUID="8eec8859-f388-4d81-bbce-0433a66a1ef7" Feb 19 09:59:23 crc kubenswrapper[4873]: I0219 09:59:23.040543 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert\") pod \"infra-operator-controller-manager-79d975b745-4t46s\" (UID: \"3ff0155f-08fd-42f5-9b31-c3b9a7cefefe\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:23 crc kubenswrapper[4873]: E0219 09:59:23.040680 4873 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 09:59:23 crc kubenswrapper[4873]: E0219 09:59:23.040797 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert podName:3ff0155f-08fd-42f5-9b31-c3b9a7cefefe nodeName:}" failed. No retries permitted until 2026-02-19 09:59:39.040759713 +0000 UTC m=+888.330191361 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert") pod "infra-operator-controller-manager-79d975b745-4t46s" (UID: "3ff0155f-08fd-42f5-9b31-c3b9a7cefefe") : secret "infra-operator-webhook-server-cert" not found Feb 19 09:59:23 crc kubenswrapper[4873]: I0219 09:59:23.246061 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv\" (UID: \"515c6c0c-ae00-4ae1-ab3f-e22e5a585681\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:23 crc kubenswrapper[4873]: E0219 09:59:23.246246 4873 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 09:59:23 crc kubenswrapper[4873]: E0219 09:59:23.246599 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert podName:515c6c0c-ae00-4ae1-ab3f-e22e5a585681 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:39.246575796 +0000 UTC m=+888.536007434 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" (UID: "515c6c0c-ae00-4ae1-ab3f-e22e5a585681") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 09:59:23 crc kubenswrapper[4873]: E0219 09:59:23.608745 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt" podUID="8eec8859-f388-4d81-bbce-0433a66a1ef7" Feb 19 09:59:23 crc kubenswrapper[4873]: I0219 09:59:23.651334 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:23 crc kubenswrapper[4873]: I0219 09:59:23.651398 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:23 crc kubenswrapper[4873]: E0219 09:59:23.651547 4873 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 09:59:23 crc kubenswrapper[4873]: E0219 09:59:23.651569 4873 secret.go:188] Couldn't get secret 
openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 09:59:23 crc kubenswrapper[4873]: E0219 09:59:23.651626 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs podName:26f0a6ea-18fb-411a-b193-83938a4bbe19 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:39.651611127 +0000 UTC m=+888.941042765 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs") pod "openstack-operator-controller-manager-77c7c45f98-q8khx" (UID: "26f0a6ea-18fb-411a-b193-83938a4bbe19") : secret "metrics-server-cert" not found Feb 19 09:59:23 crc kubenswrapper[4873]: E0219 09:59:23.651641 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs podName:26f0a6ea-18fb-411a-b193-83938a4bbe19 nodeName:}" failed. No retries permitted until 2026-02-19 09:59:39.651634328 +0000 UTC m=+888.941065966 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs") pod "openstack-operator-controller-manager-77c7c45f98-q8khx" (UID: "26f0a6ea-18fb-411a-b193-83938a4bbe19") : secret "webhook-server-cert" not found Feb 19 09:59:28 crc kubenswrapper[4873]: E0219 09:59:28.220348 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 19 09:59:28 crc kubenswrapper[4873]: E0219 09:59:28.220760 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pj849,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-lcnz4_openstack-operators(9574bff7-0aac-4a24-b69f-135ff968422e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 09:59:28 crc kubenswrapper[4873]: E0219 09:59:28.224118 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lcnz4" podUID="9574bff7-0aac-4a24-b69f-135ff968422e" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.663772 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-t54x9" event={"ID":"f108f6ea-4506-48bf-b948-e367078c3dce","Type":"ContainerStarted","Data":"29d931c2988d09b27a1e6f611064cdd823148838b7ad43fa11d4fc9b9b8f0f29"} Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.664603 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-t54x9" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.674567 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t7mwr" event={"ID":"ecf3484a-026e-4655-bfa8-e5292e2f62c5","Type":"ContainerStarted","Data":"d67de41f65a59288d5d6e22393c79582de02bf8df2658fdddf301f0a838f53ba"} Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.675165 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t7mwr" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.681743 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vwx5n" event={"ID":"8d4b6c84-e5ed-4761-b7c7-95b21da856f7","Type":"ContainerStarted","Data":"79e6f80de933ceb5722e0c6ebbbb62ad1cfbf85954dd07be035079693509ffc7"} Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.681999 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vwx5n" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.682870 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-t54x9" podStartSLOduration=7.811742368 podStartE2EDuration="22.682856256s" podCreationTimestamp="2026-02-19 09:59:06 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.110396515 +0000 UTC m=+857.399828153" lastFinishedPulling="2026-02-19 09:59:22.981510403 +0000 UTC m=+872.270942041" observedRunningTime="2026-02-19 09:59:28.681611415 +0000 UTC m=+877.971043053" watchObservedRunningTime="2026-02-19 09:59:28.682856256 +0000 UTC m=+877.972287894" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.683065 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-r74rt" event={"ID":"1f098ace-bbc4-46ee-8e72-ab65a59851eb","Type":"ContainerStarted","Data":"7edecaa3a0e21c14a282d96f052579159fde142b47d7bef35a09855e261458b3"} Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.683644 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-r74rt" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.690062 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-cx7xf" event={"ID":"2e7ca3f2-f73b-4bac-93bb-68b2518d956e","Type":"ContainerStarted","Data":"1826e72e9b065f61f4e2df81369c7ecd90280acf1c40019caaee869094c89687"} Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.690255 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-cx7xf" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.707196 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-r9b5b" event={"ID":"2b1c8872-b310-4994-819c-a8e472d8e522","Type":"ContainerStarted","Data":"98e29629da093cb432a8dfd1c23886c00b3cbf4873382f26efa14b76c8149a8f"} Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.707342 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-r9b5b" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.713808 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-f86jr" event={"ID":"aeccf47e-b953-4036-b271-be284b9ab385","Type":"ContainerStarted","Data":"96e2006a4998e860d5bf286406a428397babc6f1c17e7781521102fcc8a8d198"} Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.713961 4873 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-f86jr" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.719713 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t7mwr" podStartSLOduration=3.305696685 podStartE2EDuration="21.719701494s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.731328707 +0000 UTC m=+858.020760345" lastFinishedPulling="2026-02-19 09:59:27.145333476 +0000 UTC m=+876.434765154" observedRunningTime="2026-02-19 09:59:28.718528395 +0000 UTC m=+878.007960053" watchObservedRunningTime="2026-02-19 09:59:28.719701494 +0000 UTC m=+878.009133132" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.734656 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-t9kgf" event={"ID":"080befba-c501-4f84-8644-6b9fda0d8d5f","Type":"ContainerStarted","Data":"a0dcc3dc7a4595ec37eb7b12925d55cd915d9823fe599d69974aadaedae2f691"} Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.734720 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-t9kgf" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.746396 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-hqmvw" event={"ID":"d53d2bae-fcdd-408c-9950-440e841cc035","Type":"ContainerStarted","Data":"c4e08070c9942f76bd9fad5c909640a0b97649c29a8681bc95f5b4226a48f44f"} Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.747182 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-hqmvw" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.750703 4873 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-f86jr" podStartSLOduration=7.46964516 podStartE2EDuration="21.750685127s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.700337743 +0000 UTC m=+857.989769381" lastFinishedPulling="2026-02-19 09:59:22.98137771 +0000 UTC m=+872.270809348" observedRunningTime="2026-02-19 09:59:28.74594388 +0000 UTC m=+878.035375518" watchObservedRunningTime="2026-02-19 09:59:28.750685127 +0000 UTC m=+878.040116765" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.765763 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-vgxsl" event={"ID":"43531003-74d3-43b9-b0f5-6fca42b21975","Type":"ContainerStarted","Data":"1244d5df9af620976d080b9e36a7e9ca024ee40b4ceb2fe45b43c0af83c3d05e"} Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.765805 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-vgxsl" Feb 19 09:59:28 crc kubenswrapper[4873]: E0219 09:59:28.770295 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lcnz4" podUID="9574bff7-0aac-4a24-b69f-135ff968422e" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.775575 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-r9b5b" podStartSLOduration=7.4708913710000004 podStartE2EDuration="21.77555846s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.675667925 +0000 UTC 
m=+857.965099563" lastFinishedPulling="2026-02-19 09:59:22.980335014 +0000 UTC m=+872.269766652" observedRunningTime="2026-02-19 09:59:28.772982887 +0000 UTC m=+878.062414525" watchObservedRunningTime="2026-02-19 09:59:28.77555846 +0000 UTC m=+878.064990098" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.803809 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vwx5n" podStartSLOduration=3.984579821 podStartE2EDuration="22.803791846s" podCreationTimestamp="2026-02-19 09:59:06 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.324184843 +0000 UTC m=+857.613616481" lastFinishedPulling="2026-02-19 09:59:27.143396818 +0000 UTC m=+876.432828506" observedRunningTime="2026-02-19 09:59:28.801401717 +0000 UTC m=+878.090833355" watchObservedRunningTime="2026-02-19 09:59:28.803791846 +0000 UTC m=+878.093223474" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.827399 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-cx7xf" podStartSLOduration=8.628237799 podStartE2EDuration="22.827381047s" podCreationTimestamp="2026-02-19 09:59:06 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.781475063 +0000 UTC m=+858.070906701" lastFinishedPulling="2026-02-19 09:59:22.980618311 +0000 UTC m=+872.270049949" observedRunningTime="2026-02-19 09:59:28.823088102 +0000 UTC m=+878.112519740" watchObservedRunningTime="2026-02-19 09:59:28.827381047 +0000 UTC m=+878.116812675" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.847459 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-r74rt" podStartSLOduration=7.797441279 podStartE2EDuration="21.847445732s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.931326416 +0000 UTC m=+858.220758054" 
lastFinishedPulling="2026-02-19 09:59:22.981330869 +0000 UTC m=+872.270762507" observedRunningTime="2026-02-19 09:59:28.846385616 +0000 UTC m=+878.135817244" watchObservedRunningTime="2026-02-19 09:59:28.847445732 +0000 UTC m=+878.136877370" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.878471 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-t9kgf" podStartSLOduration=3.635320858 podStartE2EDuration="21.878450136s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.901365717 +0000 UTC m=+858.190797355" lastFinishedPulling="2026-02-19 09:59:27.144494965 +0000 UTC m=+876.433926633" observedRunningTime="2026-02-19 09:59:28.873836482 +0000 UTC m=+878.163268120" watchObservedRunningTime="2026-02-19 09:59:28.878450136 +0000 UTC m=+878.167881774" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.903914 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-vgxsl" podStartSLOduration=4.365301274 podStartE2EDuration="22.903895843s" podCreationTimestamp="2026-02-19 09:59:06 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.604466081 +0000 UTC m=+857.893897719" lastFinishedPulling="2026-02-19 09:59:27.14306061 +0000 UTC m=+876.432492288" observedRunningTime="2026-02-19 09:59:28.900380046 +0000 UTC m=+878.189811684" watchObservedRunningTime="2026-02-19 09:59:28.903895843 +0000 UTC m=+878.193327481" Feb 19 09:59:28 crc kubenswrapper[4873]: I0219 09:59:28.926360 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-hqmvw" podStartSLOduration=8.722358359 podStartE2EDuration="22.926345106s" podCreationTimestamp="2026-02-19 09:59:06 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.776269755 +0000 UTC m=+858.065701383" 
lastFinishedPulling="2026-02-19 09:59:22.980256492 +0000 UTC m=+872.269688130" observedRunningTime="2026-02-19 09:59:28.92448355 +0000 UTC m=+878.213915188" watchObservedRunningTime="2026-02-19 09:59:28.926345106 +0000 UTC m=+878.215776744" Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.800085 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6" event={"ID":"588098b3-662f-4f6f-914c-8cb28e055ccd","Type":"ContainerStarted","Data":"556f49e02480733ef35a0122aad086aa30f4e5328f5458b59f88dda31745f1f1"} Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.802072 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6" Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.804444 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w" event={"ID":"e827e28d-ffd8-4f59-82bf-a6db1dab5413","Type":"ContainerStarted","Data":"129c2f452ff95128f6ec63cabadae31295f2c92340fc15d3521904f4b92c9407"} Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.805247 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w" Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.807162 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc" event={"ID":"0e9da99c-56ee-4353-9378-c59a2c4e1608","Type":"ContainerStarted","Data":"af72e2e3aced14f4a4dad927f0dd599cac8ddbaec28f45701eb821856e9a3cad"} Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.807440 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc" Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.809611 4873 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv" event={"ID":"74e9952e-50ef-4389-aa77-8f6e9cc790a8","Type":"ContainerStarted","Data":"957454f5f698313a258239d295f2a02b75685f24992e9fde3611236a7684ae24"} Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.809917 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv" Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.811591 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-2szzj" event={"ID":"e139553a-a68d-424d-95b5-9093ea05440b","Type":"ContainerStarted","Data":"aadf9f41334181b03820e6ee84b0ee433a46d40e95ded820cb0c23d5f04dd38c"} Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.812014 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-2szzj" Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.813609 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr" event={"ID":"dc53742c-7e71-49fa-9378-b26036c80275","Type":"ContainerStarted","Data":"b0fb5a0bcbe2638cc1fae6ab7ea0d86be42f597f41f2ec6cdaed6ebfe6646ab0"} Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.814051 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr" Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.849469 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc" podStartSLOduration=3.111922571 podStartE2EDuration="26.849453141s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:09.023958869 +0000 UTC 
m=+858.313390497" lastFinishedPulling="2026-02-19 09:59:32.761489389 +0000 UTC m=+882.050921067" observedRunningTime="2026-02-19 09:59:33.8453717 +0000 UTC m=+883.134803348" watchObservedRunningTime="2026-02-19 09:59:33.849453141 +0000 UTC m=+883.138884779" Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.850640 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6" podStartSLOduration=3.03643035 podStartE2EDuration="26.85063572s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.947694309 +0000 UTC m=+858.237125947" lastFinishedPulling="2026-02-19 09:59:32.761899679 +0000 UTC m=+882.051331317" observedRunningTime="2026-02-19 09:59:33.831416516 +0000 UTC m=+883.120848154" watchObservedRunningTime="2026-02-19 09:59:33.85063572 +0000 UTC m=+883.140067358" Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.862778 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv" podStartSLOduration=3.048748404 podStartE2EDuration="26.862762829s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.947398222 +0000 UTC m=+858.236829860" lastFinishedPulling="2026-02-19 09:59:32.761412617 +0000 UTC m=+882.050844285" observedRunningTime="2026-02-19 09:59:33.861522778 +0000 UTC m=+883.150954416" watchObservedRunningTime="2026-02-19 09:59:33.862762829 +0000 UTC m=+883.152194467" Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.882940 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w" podStartSLOduration=3.006762219 podStartE2EDuration="26.882920645s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:09.021703253 +0000 UTC m=+858.311134891" 
lastFinishedPulling="2026-02-19 09:59:32.897861669 +0000 UTC m=+882.187293317" observedRunningTime="2026-02-19 09:59:33.87823689 +0000 UTC m=+883.167668528" watchObservedRunningTime="2026-02-19 09:59:33.882920645 +0000 UTC m=+883.172352283" Feb 19 09:59:33 crc kubenswrapper[4873]: I0219 09:59:33.894584 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr" podStartSLOduration=3.120120502 podStartE2EDuration="26.894564242s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.982883846 +0000 UTC m=+858.272315484" lastFinishedPulling="2026-02-19 09:59:32.757327576 +0000 UTC m=+882.046759224" observedRunningTime="2026-02-19 09:59:33.892979913 +0000 UTC m=+883.182411561" watchObservedRunningTime="2026-02-19 09:59:33.894564242 +0000 UTC m=+883.183995890" Feb 19 09:59:34 crc kubenswrapper[4873]: I0219 09:59:34.510609 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-2szzj" podStartSLOduration=3.779141884 podStartE2EDuration="27.510594374s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:09.026207574 +0000 UTC m=+858.315639212" lastFinishedPulling="2026-02-19 09:59:32.757660054 +0000 UTC m=+882.047091702" observedRunningTime="2026-02-19 09:59:33.928403376 +0000 UTC m=+883.217835024" watchObservedRunningTime="2026-02-19 09:59:34.510594374 +0000 UTC m=+883.800026012" Feb 19 09:59:34 crc kubenswrapper[4873]: I0219 09:59:34.825122 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl" event={"ID":"e4172fa9-b04e-4894-82d6-ec65ea92b004","Type":"ContainerStarted","Data":"365db5b99786833a9aa194aeb1ee3d11c9cc32e014375f17139206bb6e74ec4d"} Feb 19 09:59:34 crc kubenswrapper[4873]: I0219 09:59:34.825492 4873 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl" Feb 19 09:59:34 crc kubenswrapper[4873]: I0219 09:59:34.850491 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl" podStartSLOduration=2.24553966 podStartE2EDuration="27.85046534s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.770675027 +0000 UTC m=+858.060106665" lastFinishedPulling="2026-02-19 09:59:34.375600707 +0000 UTC m=+883.665032345" observedRunningTime="2026-02-19 09:59:34.840902924 +0000 UTC m=+884.130334562" watchObservedRunningTime="2026-02-19 09:59:34.85046534 +0000 UTC m=+884.139897018" Feb 19 09:59:35 crc kubenswrapper[4873]: I0219 09:59:35.834867 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt" event={"ID":"8eec8859-f388-4d81-bbce-0433a66a1ef7","Type":"ContainerStarted","Data":"c3425d7eee071d6bac7344102c4233d900c4d7682b2503d76e8ff928df725026"} Feb 19 09:59:35 crc kubenswrapper[4873]: I0219 09:59:35.835231 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt" Feb 19 09:59:35 crc kubenswrapper[4873]: I0219 09:59:35.865981 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt" podStartSLOduration=2.777402185 podStartE2EDuration="28.865949324s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.782671742 +0000 UTC m=+858.072103380" lastFinishedPulling="2026-02-19 09:59:34.871218841 +0000 UTC m=+884.160650519" observedRunningTime="2026-02-19 09:59:35.858255715 +0000 UTC m=+885.147687393" watchObservedRunningTime="2026-02-19 09:59:35.865949324 +0000 UTC m=+885.155381002" Feb 19 09:59:37 crc 
kubenswrapper[4873]: I0219 09:59:37.388766 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-t54x9" Feb 19 09:59:37 crc kubenswrapper[4873]: I0219 09:59:37.398504 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-vgxsl" Feb 19 09:59:37 crc kubenswrapper[4873]: I0219 09:59:37.418047 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-vwx5n" Feb 19 09:59:37 crc kubenswrapper[4873]: I0219 09:59:37.436419 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-r9b5b" Feb 19 09:59:37 crc kubenswrapper[4873]: I0219 09:59:37.523895 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t7mwr" Feb 19 09:59:37 crc kubenswrapper[4873]: I0219 09:59:37.549747 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-f86jr" Feb 19 09:59:37 crc kubenswrapper[4873]: I0219 09:59:37.593257 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-hqmvw" Feb 19 09:59:37 crc kubenswrapper[4873]: I0219 09:59:37.594912 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-cx7xf" Feb 19 09:59:37 crc kubenswrapper[4873]: I0219 09:59:37.743234 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-t9kgf" Feb 19 09:59:37 crc kubenswrapper[4873]: I0219 09:59:37.796010 4873 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-r74rt" Feb 19 09:59:37 crc kubenswrapper[4873]: I0219 09:59:37.851883 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72" event={"ID":"c471d099-fa02-4463-9eb9-9d0f6a3832e6","Type":"ContainerStarted","Data":"8dda634260001ca8bb82be92714b9213968e96e0e99e9572a71e53f08462f313"} Feb 19 09:59:37 crc kubenswrapper[4873]: I0219 09:59:37.852068 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72" Feb 19 09:59:37 crc kubenswrapper[4873]: I0219 09:59:37.865087 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72" podStartSLOduration=2.721383287 podStartE2EDuration="30.865075441s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:08.8016436 +0000 UTC m=+858.091075238" lastFinishedPulling="2026-02-19 09:59:36.945335754 +0000 UTC m=+886.234767392" observedRunningTime="2026-02-19 09:59:37.864808344 +0000 UTC m=+887.154239982" watchObservedRunningTime="2026-02-19 09:59:37.865075441 +0000 UTC m=+887.154507079" Feb 19 09:59:38 crc kubenswrapper[4873]: I0219 09:59:38.047723 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7d767c64df-hld6w" Feb 19 09:59:39 crc kubenswrapper[4873]: I0219 09:59:39.095654 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert\") pod \"infra-operator-controller-manager-79d975b745-4t46s\" (UID: \"3ff0155f-08fd-42f5-9b31-c3b9a7cefefe\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 
09:59:39 crc kubenswrapper[4873]: I0219 09:59:39.101856 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff0155f-08fd-42f5-9b31-c3b9a7cefefe-cert\") pod \"infra-operator-controller-manager-79d975b745-4t46s\" (UID: \"3ff0155f-08fd-42f5-9b31-c3b9a7cefefe\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:39 crc kubenswrapper[4873]: I0219 09:59:39.260719 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:39 crc kubenswrapper[4873]: I0219 09:59:39.298482 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv\" (UID: \"515c6c0c-ae00-4ae1-ab3f-e22e5a585681\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:39 crc kubenswrapper[4873]: I0219 09:59:39.304700 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/515c6c0c-ae00-4ae1-ab3f-e22e5a585681-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv\" (UID: \"515c6c0c-ae00-4ae1-ab3f-e22e5a585681\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:39 crc kubenswrapper[4873]: I0219 09:59:39.563163 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:39 crc kubenswrapper[4873]: I0219 09:59:39.715939 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:39 crc kubenswrapper[4873]: I0219 09:59:39.716476 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:39 crc kubenswrapper[4873]: I0219 09:59:39.731771 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-metrics-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:39 crc kubenswrapper[4873]: I0219 09:59:39.734464 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/26f0a6ea-18fb-411a-b193-83938a4bbe19-webhook-certs\") pod \"openstack-operator-controller-manager-77c7c45f98-q8khx\" (UID: \"26f0a6ea-18fb-411a-b193-83938a4bbe19\") " pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:39 crc kubenswrapper[4873]: I0219 09:59:39.747465 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-79d975b745-4t46s"] Feb 19 09:59:39 crc kubenswrapper[4873]: I0219 09:59:39.872516 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" event={"ID":"3ff0155f-08fd-42f5-9b31-c3b9a7cefefe","Type":"ContainerStarted","Data":"01c3a3adfdd88439bdfe6b1a13999b1d7fe0011c8f59f98ae2ef2dbe2e6e9998"} Feb 19 09:59:39 crc kubenswrapper[4873]: I0219 09:59:39.985469 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:40 crc kubenswrapper[4873]: I0219 09:59:40.062991 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv"] Feb 19 09:59:40 crc kubenswrapper[4873]: I0219 09:59:40.231334 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx"] Feb 19 09:59:40 crc kubenswrapper[4873]: W0219 09:59:40.239703 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26f0a6ea_18fb_411a_b193_83938a4bbe19.slice/crio-45ffdab2a1e2445dc1d551acb98a9598acc4a126115c29e85f2d4dc351ba9007 WatchSource:0}: Error finding container 45ffdab2a1e2445dc1d551acb98a9598acc4a126115c29e85f2d4dc351ba9007: Status 404 returned error can't find the container with id 45ffdab2a1e2445dc1d551acb98a9598acc4a126115c29e85f2d4dc351ba9007 Feb 19 09:59:40 crc kubenswrapper[4873]: I0219 09:59:40.882388 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" event={"ID":"26f0a6ea-18fb-411a-b193-83938a4bbe19","Type":"ContainerStarted","Data":"61063320a2d900665cae5b9138cfaf5c31fbb1ced795d32d4f69ab2c83cc0556"} Feb 19 09:59:40 crc kubenswrapper[4873]: I0219 
09:59:40.882430 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" event={"ID":"26f0a6ea-18fb-411a-b193-83938a4bbe19","Type":"ContainerStarted","Data":"45ffdab2a1e2445dc1d551acb98a9598acc4a126115c29e85f2d4dc351ba9007"} Feb 19 09:59:40 crc kubenswrapper[4873]: I0219 09:59:40.882970 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 09:59:40 crc kubenswrapper[4873]: I0219 09:59:40.887663 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" event={"ID":"515c6c0c-ae00-4ae1-ab3f-e22e5a585681","Type":"ContainerStarted","Data":"bcfe01fe8056c96d6a78880051378c0dacda08484df5c7eb300694fe6fb6c7a0"} Feb 19 09:59:40 crc kubenswrapper[4873]: I0219 09:59:40.915249 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" podStartSLOduration=33.915227348 podStartE2EDuration="33.915227348s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:59:40.911231899 +0000 UTC m=+890.200709888" watchObservedRunningTime="2026-02-19 09:59:40.915227348 +0000 UTC m=+890.204658986" Feb 19 09:59:42 crc kubenswrapper[4873]: I0219 09:59:42.904263 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" event={"ID":"515c6c0c-ae00-4ae1-ab3f-e22e5a585681","Type":"ContainerStarted","Data":"39b5b23436213b86e36a6f189ff575a10fd54aa886bb7527359f39a1dcb974b2"} Feb 19 09:59:42 crc kubenswrapper[4873]: I0219 09:59:42.904715 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:42 crc kubenswrapper[4873]: I0219 09:59:42.906901 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" event={"ID":"3ff0155f-08fd-42f5-9b31-c3b9a7cefefe","Type":"ContainerStarted","Data":"abb4d278c8f5ca0e86bbac695d92d7f163399f825eddb0cd4b9050686dc342bd"} Feb 19 09:59:42 crc kubenswrapper[4873]: I0219 09:59:42.907115 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:42 crc kubenswrapper[4873]: I0219 09:59:42.910987 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lcnz4" event={"ID":"9574bff7-0aac-4a24-b69f-135ff968422e","Type":"ContainerStarted","Data":"5c9f43e3e91a76192aa79b3001b3fd64a0e6d6118f3862d95f7a3d8822800134"} Feb 19 09:59:42 crc kubenswrapper[4873]: I0219 09:59:42.938862 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" podStartSLOduration=33.960975806 podStartE2EDuration="35.938840228s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:40.083749017 +0000 UTC m=+889.373180655" lastFinishedPulling="2026-02-19 09:59:42.061613439 +0000 UTC m=+891.351045077" observedRunningTime="2026-02-19 09:59:42.9368942 +0000 UTC m=+892.226325878" watchObservedRunningTime="2026-02-19 09:59:42.938840228 +0000 UTC m=+892.228271866" Feb 19 09:59:42 crc kubenswrapper[4873]: I0219 09:59:42.959593 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" podStartSLOduration=33.660798227 podStartE2EDuration="35.959570678s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" 
firstStartedPulling="2026-02-19 09:59:39.755437346 +0000 UTC m=+889.044868984" lastFinishedPulling="2026-02-19 09:59:42.054209777 +0000 UTC m=+891.343641435" observedRunningTime="2026-02-19 09:59:42.956583524 +0000 UTC m=+892.246015202" watchObservedRunningTime="2026-02-19 09:59:42.959570678 +0000 UTC m=+892.249002356" Feb 19 09:59:42 crc kubenswrapper[4873]: I0219 09:59:42.988779 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lcnz4" podStartSLOduration=2.945286513 podStartE2EDuration="35.988757387s" podCreationTimestamp="2026-02-19 09:59:07 +0000 UTC" firstStartedPulling="2026-02-19 09:59:09.015362777 +0000 UTC m=+858.304794415" lastFinishedPulling="2026-02-19 09:59:42.058833621 +0000 UTC m=+891.348265289" observedRunningTime="2026-02-19 09:59:42.978020672 +0000 UTC m=+892.267452310" watchObservedRunningTime="2026-02-19 09:59:42.988757387 +0000 UTC m=+892.278189045" Feb 19 09:59:47 crc kubenswrapper[4873]: I0219 09:59:47.524435 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-t2hfl" Feb 19 09:59:47 crc kubenswrapper[4873]: I0219 09:59:47.563394 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-d6h72" Feb 19 09:59:47 crc kubenswrapper[4873]: I0219 09:59:47.644470 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8v7q6" Feb 19 09:59:47 crc kubenswrapper[4873]: I0219 09:59:47.665247 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-n6djt" Feb 19 09:59:47 crc kubenswrapper[4873]: I0219 09:59:47.757363 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-db4dr" Feb 19 09:59:47 crc kubenswrapper[4873]: I0219 09:59:47.797845 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-6hpwv" Feb 19 09:59:47 crc kubenswrapper[4873]: I0219 09:59:47.910200 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-g22tc" Feb 19 09:59:47 crc kubenswrapper[4873]: I0219 09:59:47.993209 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-2szzj" Feb 19 09:59:48 crc kubenswrapper[4873]: I0219 09:59:48.240363 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:59:48 crc kubenswrapper[4873]: I0219 09:59:48.240440 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:59:49 crc kubenswrapper[4873]: I0219 09:59:49.270211 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4t46s" Feb 19 09:59:49 crc kubenswrapper[4873]: I0219 09:59:49.570795 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv" Feb 19 09:59:49 crc kubenswrapper[4873]: I0219 09:59:49.992680 4873 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-77c7c45f98-q8khx" Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.187572 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm"] Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.189732 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.198559 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm"] Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.199885 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.200428 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.328521 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbkp7\" (UniqueName: \"kubernetes.io/projected/890a4af6-c400-4f2c-a387-edcbbc821b11-kube-api-access-kbkp7\") pod \"collect-profiles-29524920-796dm\" (UID: \"890a4af6-c400-4f2c-a387-edcbbc821b11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.328639 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/890a4af6-c400-4f2c-a387-edcbbc821b11-config-volume\") pod \"collect-profiles-29524920-796dm\" (UID: \"890a4af6-c400-4f2c-a387-edcbbc821b11\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.328785 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/890a4af6-c400-4f2c-a387-edcbbc821b11-secret-volume\") pod \"collect-profiles-29524920-796dm\" (UID: \"890a4af6-c400-4f2c-a387-edcbbc821b11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.430208 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/890a4af6-c400-4f2c-a387-edcbbc821b11-config-volume\") pod \"collect-profiles-29524920-796dm\" (UID: \"890a4af6-c400-4f2c-a387-edcbbc821b11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.430366 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/890a4af6-c400-4f2c-a387-edcbbc821b11-secret-volume\") pod \"collect-profiles-29524920-796dm\" (UID: \"890a4af6-c400-4f2c-a387-edcbbc821b11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.430464 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbkp7\" (UniqueName: \"kubernetes.io/projected/890a4af6-c400-4f2c-a387-edcbbc821b11-kube-api-access-kbkp7\") pod \"collect-profiles-29524920-796dm\" (UID: \"890a4af6-c400-4f2c-a387-edcbbc821b11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.432530 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/890a4af6-c400-4f2c-a387-edcbbc821b11-config-volume\") pod \"collect-profiles-29524920-796dm\" (UID: \"890a4af6-c400-4f2c-a387-edcbbc821b11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.442684 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/890a4af6-c400-4f2c-a387-edcbbc821b11-secret-volume\") pod \"collect-profiles-29524920-796dm\" (UID: \"890a4af6-c400-4f2c-a387-edcbbc821b11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.451541 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbkp7\" (UniqueName: \"kubernetes.io/projected/890a4af6-c400-4f2c-a387-edcbbc821b11-kube-api-access-kbkp7\") pod \"collect-profiles-29524920-796dm\" (UID: \"890a4af6-c400-4f2c-a387-edcbbc821b11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.519458 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" Feb 19 10:00:00 crc kubenswrapper[4873]: I0219 10:00:00.978640 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm"] Feb 19 10:00:00 crc kubenswrapper[4873]: W0219 10:00:00.991012 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod890a4af6_c400_4f2c_a387_edcbbc821b11.slice/crio-c82b10b78a2df67bafea6ed3d2a04f1a658d9480b1b665d50562afe3fc6cc091 WatchSource:0}: Error finding container c82b10b78a2df67bafea6ed3d2a04f1a658d9480b1b665d50562afe3fc6cc091: Status 404 returned error can't find the container with id c82b10b78a2df67bafea6ed3d2a04f1a658d9480b1b665d50562afe3fc6cc091 Feb 19 10:00:01 crc kubenswrapper[4873]: I0219 10:00:01.062128 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" event={"ID":"890a4af6-c400-4f2c-a387-edcbbc821b11","Type":"ContainerStarted","Data":"c82b10b78a2df67bafea6ed3d2a04f1a658d9480b1b665d50562afe3fc6cc091"} Feb 19 10:00:02 crc kubenswrapper[4873]: I0219 10:00:02.071211 4873 generic.go:334] "Generic (PLEG): container finished" podID="890a4af6-c400-4f2c-a387-edcbbc821b11" containerID="2ea87556ea1e2777f378238131c83ccd55a7eac5410c13097afbd46ee33f0929" exitCode=0 Feb 19 10:00:02 crc kubenswrapper[4873]: I0219 10:00:02.071773 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" event={"ID":"890a4af6-c400-4f2c-a387-edcbbc821b11","Type":"ContainerDied","Data":"2ea87556ea1e2777f378238131c83ccd55a7eac5410c13097afbd46ee33f0929"} Feb 19 10:00:03 crc kubenswrapper[4873]: I0219 10:00:03.360437 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" Feb 19 10:00:03 crc kubenswrapper[4873]: I0219 10:00:03.478509 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/890a4af6-c400-4f2c-a387-edcbbc821b11-config-volume\") pod \"890a4af6-c400-4f2c-a387-edcbbc821b11\" (UID: \"890a4af6-c400-4f2c-a387-edcbbc821b11\") " Feb 19 10:00:03 crc kubenswrapper[4873]: I0219 10:00:03.478563 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbkp7\" (UniqueName: \"kubernetes.io/projected/890a4af6-c400-4f2c-a387-edcbbc821b11-kube-api-access-kbkp7\") pod \"890a4af6-c400-4f2c-a387-edcbbc821b11\" (UID: \"890a4af6-c400-4f2c-a387-edcbbc821b11\") " Feb 19 10:00:03 crc kubenswrapper[4873]: I0219 10:00:03.478630 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/890a4af6-c400-4f2c-a387-edcbbc821b11-secret-volume\") pod \"890a4af6-c400-4f2c-a387-edcbbc821b11\" (UID: \"890a4af6-c400-4f2c-a387-edcbbc821b11\") " Feb 19 10:00:03 crc kubenswrapper[4873]: I0219 10:00:03.479307 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/890a4af6-c400-4f2c-a387-edcbbc821b11-config-volume" (OuterVolumeSpecName: "config-volume") pod "890a4af6-c400-4f2c-a387-edcbbc821b11" (UID: "890a4af6-c400-4f2c-a387-edcbbc821b11"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:00:03 crc kubenswrapper[4873]: I0219 10:00:03.490468 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890a4af6-c400-4f2c-a387-edcbbc821b11-kube-api-access-kbkp7" (OuterVolumeSpecName: "kube-api-access-kbkp7") pod "890a4af6-c400-4f2c-a387-edcbbc821b11" (UID: "890a4af6-c400-4f2c-a387-edcbbc821b11"). 
InnerVolumeSpecName "kube-api-access-kbkp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:00:03 crc kubenswrapper[4873]: I0219 10:00:03.495887 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890a4af6-c400-4f2c-a387-edcbbc821b11-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "890a4af6-c400-4f2c-a387-edcbbc821b11" (UID: "890a4af6-c400-4f2c-a387-edcbbc821b11"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:00:03 crc kubenswrapper[4873]: I0219 10:00:03.579881 4873 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/890a4af6-c400-4f2c-a387-edcbbc821b11-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 10:00:03 crc kubenswrapper[4873]: I0219 10:00:03.579926 4873 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/890a4af6-c400-4f2c-a387-edcbbc821b11-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 10:00:03 crc kubenswrapper[4873]: I0219 10:00:03.580003 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbkp7\" (UniqueName: \"kubernetes.io/projected/890a4af6-c400-4f2c-a387-edcbbc821b11-kube-api-access-kbkp7\") on node \"crc\" DevicePath \"\"" Feb 19 10:00:04 crc kubenswrapper[4873]: I0219 10:00:04.088581 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" event={"ID":"890a4af6-c400-4f2c-a387-edcbbc821b11","Type":"ContainerDied","Data":"c82b10b78a2df67bafea6ed3d2a04f1a658d9480b1b665d50562afe3fc6cc091"} Feb 19 10:00:04 crc kubenswrapper[4873]: I0219 10:00:04.088880 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c82b10b78a2df67bafea6ed3d2a04f1a658d9480b1b665d50562afe3fc6cc091" Feb 19 10:00:04 crc kubenswrapper[4873]: I0219 10:00:04.088832 4873 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.425596 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57b9d58665-gfr42"] Feb 19 10:00:08 crc kubenswrapper[4873]: E0219 10:00:08.426505 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890a4af6-c400-4f2c-a387-edcbbc821b11" containerName="collect-profiles" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.426517 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="890a4af6-c400-4f2c-a387-edcbbc821b11" containerName="collect-profiles" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.426669 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="890a4af6-c400-4f2c-a387-edcbbc821b11" containerName="collect-profiles" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.427495 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57b9d58665-gfr42" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.431619 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.431975 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.432226 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-8n6zj" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.432493 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.442817 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57b9d58665-gfr42"] Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.481072 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bb9bf987-bjckx"] Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.482181 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.484169 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.493720 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bb9bf987-bjckx"] Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.550869 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnswl\" (UniqueName: \"kubernetes.io/projected/b3a77d5e-b932-466f-a391-983ffef7c5ae-kube-api-access-nnswl\") pod \"dnsmasq-dns-57b9d58665-gfr42\" (UID: \"b3a77d5e-b932-466f-a391-983ffef7c5ae\") " pod="openstack/dnsmasq-dns-57b9d58665-gfr42" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.550925 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bea1b96b-f9da-4733-a537-a536ec66edc0-dns-svc\") pod \"dnsmasq-dns-7bb9bf987-bjckx\" (UID: \"bea1b96b-f9da-4733-a537-a536ec66edc0\") " pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.550957 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a77d5e-b932-466f-a391-983ffef7c5ae-config\") pod \"dnsmasq-dns-57b9d58665-gfr42\" (UID: \"b3a77d5e-b932-466f-a391-983ffef7c5ae\") " pod="openstack/dnsmasq-dns-57b9d58665-gfr42" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.550985 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4ddx\" (UniqueName: \"kubernetes.io/projected/bea1b96b-f9da-4733-a537-a536ec66edc0-kube-api-access-r4ddx\") pod \"dnsmasq-dns-7bb9bf987-bjckx\" (UID: \"bea1b96b-f9da-4733-a537-a536ec66edc0\") " 
pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.551017 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea1b96b-f9da-4733-a537-a536ec66edc0-config\") pod \"dnsmasq-dns-7bb9bf987-bjckx\" (UID: \"bea1b96b-f9da-4733-a537-a536ec66edc0\") " pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.652042 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnswl\" (UniqueName: \"kubernetes.io/projected/b3a77d5e-b932-466f-a391-983ffef7c5ae-kube-api-access-nnswl\") pod \"dnsmasq-dns-57b9d58665-gfr42\" (UID: \"b3a77d5e-b932-466f-a391-983ffef7c5ae\") " pod="openstack/dnsmasq-dns-57b9d58665-gfr42" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.652324 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bea1b96b-f9da-4733-a537-a536ec66edc0-dns-svc\") pod \"dnsmasq-dns-7bb9bf987-bjckx\" (UID: \"bea1b96b-f9da-4733-a537-a536ec66edc0\") " pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.652411 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a77d5e-b932-466f-a391-983ffef7c5ae-config\") pod \"dnsmasq-dns-57b9d58665-gfr42\" (UID: \"b3a77d5e-b932-466f-a391-983ffef7c5ae\") " pod="openstack/dnsmasq-dns-57b9d58665-gfr42" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.652488 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4ddx\" (UniqueName: \"kubernetes.io/projected/bea1b96b-f9da-4733-a537-a536ec66edc0-kube-api-access-r4ddx\") pod \"dnsmasq-dns-7bb9bf987-bjckx\" (UID: \"bea1b96b-f9da-4733-a537-a536ec66edc0\") " pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" Feb 19 10:00:08 
crc kubenswrapper[4873]: I0219 10:00:08.652568 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea1b96b-f9da-4733-a537-a536ec66edc0-config\") pod \"dnsmasq-dns-7bb9bf987-bjckx\" (UID: \"bea1b96b-f9da-4733-a537-a536ec66edc0\") " pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.653254 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bea1b96b-f9da-4733-a537-a536ec66edc0-dns-svc\") pod \"dnsmasq-dns-7bb9bf987-bjckx\" (UID: \"bea1b96b-f9da-4733-a537-a536ec66edc0\") " pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.653344 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea1b96b-f9da-4733-a537-a536ec66edc0-config\") pod \"dnsmasq-dns-7bb9bf987-bjckx\" (UID: \"bea1b96b-f9da-4733-a537-a536ec66edc0\") " pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.653410 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a77d5e-b932-466f-a391-983ffef7c5ae-config\") pod \"dnsmasq-dns-57b9d58665-gfr42\" (UID: \"b3a77d5e-b932-466f-a391-983ffef7c5ae\") " pod="openstack/dnsmasq-dns-57b9d58665-gfr42" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.673067 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4ddx\" (UniqueName: \"kubernetes.io/projected/bea1b96b-f9da-4733-a537-a536ec66edc0-kube-api-access-r4ddx\") pod \"dnsmasq-dns-7bb9bf987-bjckx\" (UID: \"bea1b96b-f9da-4733-a537-a536ec66edc0\") " pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.673094 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nnswl\" (UniqueName: \"kubernetes.io/projected/b3a77d5e-b932-466f-a391-983ffef7c5ae-kube-api-access-nnswl\") pod \"dnsmasq-dns-57b9d58665-gfr42\" (UID: \"b3a77d5e-b932-466f-a391-983ffef7c5ae\") " pod="openstack/dnsmasq-dns-57b9d58665-gfr42" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.758699 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b9d58665-gfr42" Feb 19 10:00:08 crc kubenswrapper[4873]: I0219 10:00:08.798172 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" Feb 19 10:00:09 crc kubenswrapper[4873]: I0219 10:00:09.109822 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57b9d58665-gfr42"] Feb 19 10:00:09 crc kubenswrapper[4873]: I0219 10:00:09.145252 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b9d58665-gfr42" event={"ID":"b3a77d5e-b932-466f-a391-983ffef7c5ae","Type":"ContainerStarted","Data":"909aed776126287df1a7798864d3d0881f670c7df611ac6ef1496c9f130ee423"} Feb 19 10:00:09 crc kubenswrapper[4873]: I0219 10:00:09.405473 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bb9bf987-bjckx"] Feb 19 10:00:09 crc kubenswrapper[4873]: W0219 10:00:09.413380 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbea1b96b_f9da_4733_a537_a536ec66edc0.slice/crio-68a3e971085bdee9176e4f2a37706001ac7c9cd7629296c14068391c6eb2512d WatchSource:0}: Error finding container 68a3e971085bdee9176e4f2a37706001ac7c9cd7629296c14068391c6eb2512d: Status 404 returned error can't find the container with id 68a3e971085bdee9176e4f2a37706001ac7c9cd7629296c14068391c6eb2512d Feb 19 10:00:10 crc kubenswrapper[4873]: I0219 10:00:10.153274 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" 
event={"ID":"bea1b96b-f9da-4733-a537-a536ec66edc0","Type":"ContainerStarted","Data":"68a3e971085bdee9176e4f2a37706001ac7c9cd7629296c14068391c6eb2512d"} Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.091065 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b9d58665-gfr42"] Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.118775 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7569d6d65f-54dks"] Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.119920 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7569d6d65f-54dks" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.138388 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7569d6d65f-54dks"] Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.312306 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-dns-svc\") pod \"dnsmasq-dns-7569d6d65f-54dks\" (UID: \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\") " pod="openstack/dnsmasq-dns-7569d6d65f-54dks" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.312568 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-config\") pod \"dnsmasq-dns-7569d6d65f-54dks\" (UID: \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\") " pod="openstack/dnsmasq-dns-7569d6d65f-54dks" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.312744 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pcf5\" (UniqueName: \"kubernetes.io/projected/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-kube-api-access-6pcf5\") pod \"dnsmasq-dns-7569d6d65f-54dks\" (UID: \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\") " 
pod="openstack/dnsmasq-dns-7569d6d65f-54dks" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.414466 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-dns-svc\") pod \"dnsmasq-dns-7569d6d65f-54dks\" (UID: \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\") " pod="openstack/dnsmasq-dns-7569d6d65f-54dks" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.414540 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-config\") pod \"dnsmasq-dns-7569d6d65f-54dks\" (UID: \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\") " pod="openstack/dnsmasq-dns-7569d6d65f-54dks" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.414575 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pcf5\" (UniqueName: \"kubernetes.io/projected/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-kube-api-access-6pcf5\") pod \"dnsmasq-dns-7569d6d65f-54dks\" (UID: \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\") " pod="openstack/dnsmasq-dns-7569d6d65f-54dks" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.415530 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-config\") pod \"dnsmasq-dns-7569d6d65f-54dks\" (UID: \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\") " pod="openstack/dnsmasq-dns-7569d6d65f-54dks" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.415664 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-dns-svc\") pod \"dnsmasq-dns-7569d6d65f-54dks\" (UID: \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\") " pod="openstack/dnsmasq-dns-7569d6d65f-54dks" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.424629 4873 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bb9bf987-bjckx"] Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.441511 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pcf5\" (UniqueName: \"kubernetes.io/projected/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-kube-api-access-6pcf5\") pod \"dnsmasq-dns-7569d6d65f-54dks\" (UID: \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\") " pod="openstack/dnsmasq-dns-7569d6d65f-54dks" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.467061 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-684f645dc-zkgql"] Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.468178 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-684f645dc-zkgql" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.505940 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-684f645dc-zkgql"] Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.515304 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16ae739e-2542-4b44-820b-e08570c825dc-dns-svc\") pod \"dnsmasq-dns-684f645dc-zkgql\" (UID: \"16ae739e-2542-4b44-820b-e08570c825dc\") " pod="openstack/dnsmasq-dns-684f645dc-zkgql" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.515371 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ae739e-2542-4b44-820b-e08570c825dc-config\") pod \"dnsmasq-dns-684f645dc-zkgql\" (UID: \"16ae739e-2542-4b44-820b-e08570c825dc\") " pod="openstack/dnsmasq-dns-684f645dc-zkgql" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.515417 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlb64\" (UniqueName: 
\"kubernetes.io/projected/16ae739e-2542-4b44-820b-e08570c825dc-kube-api-access-jlb64\") pod \"dnsmasq-dns-684f645dc-zkgql\" (UID: \"16ae739e-2542-4b44-820b-e08570c825dc\") " pod="openstack/dnsmasq-dns-684f645dc-zkgql" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.616750 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16ae739e-2542-4b44-820b-e08570c825dc-dns-svc\") pod \"dnsmasq-dns-684f645dc-zkgql\" (UID: \"16ae739e-2542-4b44-820b-e08570c825dc\") " pod="openstack/dnsmasq-dns-684f645dc-zkgql" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.616801 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ae739e-2542-4b44-820b-e08570c825dc-config\") pod \"dnsmasq-dns-684f645dc-zkgql\" (UID: \"16ae739e-2542-4b44-820b-e08570c825dc\") " pod="openstack/dnsmasq-dns-684f645dc-zkgql" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.616849 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlb64\" (UniqueName: \"kubernetes.io/projected/16ae739e-2542-4b44-820b-e08570c825dc-kube-api-access-jlb64\") pod \"dnsmasq-dns-684f645dc-zkgql\" (UID: \"16ae739e-2542-4b44-820b-e08570c825dc\") " pod="openstack/dnsmasq-dns-684f645dc-zkgql" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.617626 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16ae739e-2542-4b44-820b-e08570c825dc-dns-svc\") pod \"dnsmasq-dns-684f645dc-zkgql\" (UID: \"16ae739e-2542-4b44-820b-e08570c825dc\") " pod="openstack/dnsmasq-dns-684f645dc-zkgql" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.617698 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ae739e-2542-4b44-820b-e08570c825dc-config\") pod \"dnsmasq-dns-684f645dc-zkgql\" 
(UID: \"16ae739e-2542-4b44-820b-e08570c825dc\") " pod="openstack/dnsmasq-dns-684f645dc-zkgql" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.642888 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlb64\" (UniqueName: \"kubernetes.io/projected/16ae739e-2542-4b44-820b-e08570c825dc-kube-api-access-jlb64\") pod \"dnsmasq-dns-684f645dc-zkgql\" (UID: \"16ae739e-2542-4b44-820b-e08570c825dc\") " pod="openstack/dnsmasq-dns-684f645dc-zkgql" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.736623 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7569d6d65f-54dks" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.782381 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-684f645dc-zkgql" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.830393 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-684f645dc-zkgql"] Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.849510 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58ff7f48c5-nqbz4"] Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.850705 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.860185 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58ff7f48c5-nqbz4"] Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.920593 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d736e93a-6a36-458e-a8f4-a9d511530043-dns-svc\") pod \"dnsmasq-dns-58ff7f48c5-nqbz4\" (UID: \"d736e93a-6a36-458e-a8f4-a9d511530043\") " pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.920651 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5m2m\" (UniqueName: \"kubernetes.io/projected/d736e93a-6a36-458e-a8f4-a9d511530043-kube-api-access-s5m2m\") pod \"dnsmasq-dns-58ff7f48c5-nqbz4\" (UID: \"d736e93a-6a36-458e-a8f4-a9d511530043\") " pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" Feb 19 10:00:12 crc kubenswrapper[4873]: I0219 10:00:12.920752 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d736e93a-6a36-458e-a8f4-a9d511530043-config\") pod \"dnsmasq-dns-58ff7f48c5-nqbz4\" (UID: \"d736e93a-6a36-458e-a8f4-a9d511530043\") " pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.021654 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d736e93a-6a36-458e-a8f4-a9d511530043-dns-svc\") pod \"dnsmasq-dns-58ff7f48c5-nqbz4\" (UID: \"d736e93a-6a36-458e-a8f4-a9d511530043\") " pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.023852 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5m2m\" (UniqueName: 
\"kubernetes.io/projected/d736e93a-6a36-458e-a8f4-a9d511530043-kube-api-access-s5m2m\") pod \"dnsmasq-dns-58ff7f48c5-nqbz4\" (UID: \"d736e93a-6a36-458e-a8f4-a9d511530043\") " pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.024171 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d736e93a-6a36-458e-a8f4-a9d511530043-config\") pod \"dnsmasq-dns-58ff7f48c5-nqbz4\" (UID: \"d736e93a-6a36-458e-a8f4-a9d511530043\") " pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.022491 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d736e93a-6a36-458e-a8f4-a9d511530043-dns-svc\") pod \"dnsmasq-dns-58ff7f48c5-nqbz4\" (UID: \"d736e93a-6a36-458e-a8f4-a9d511530043\") " pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.024950 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d736e93a-6a36-458e-a8f4-a9d511530043-config\") pod \"dnsmasq-dns-58ff7f48c5-nqbz4\" (UID: \"d736e93a-6a36-458e-a8f4-a9d511530043\") " pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.040005 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5m2m\" (UniqueName: \"kubernetes.io/projected/d736e93a-6a36-458e-a8f4-a9d511530043-kube-api-access-s5m2m\") pod \"dnsmasq-dns-58ff7f48c5-nqbz4\" (UID: \"d736e93a-6a36-458e-a8f4-a9d511530043\") " pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.174926 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.295211 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.296666 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.299465 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-default-user" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.299612 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-erlang-cookie" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.299728 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-config-data" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.299819 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-plugins-conf" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.299993 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-server-conf" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.300217 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-notifications-rabbitmq-svc" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.300355 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-server-dockercfg-4gb2b" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.332336 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-pod-info\") pod 
\"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.332376 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.332413 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.332450 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.332488 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.332586 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc 
kubenswrapper[4873]: I0219 10:00:13.332631 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.332695 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.332717 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.332760 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.332784 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " 
pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.332806 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfzqn\" (UniqueName: \"kubernetes.io/projected/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-kube-api-access-gfzqn\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.434020 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.434412 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.434447 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.434478 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " 
pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.434492 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.434515 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.434533 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.434551 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfzqn\" (UniqueName: \"kubernetes.io/projected/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-kube-api-access-gfzqn\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.434592 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " 
pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.434617 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.434633 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.435489 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.437659 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.438221 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: 
\"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.438368 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.441810 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.442359 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.446532 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.447042 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " 
pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.455804 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.456416 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.458499 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfzqn\" (UniqueName: \"kubernetes.io/projected/da89f0ff-c51c-4c4a-8df4-f7787d29ddd2-kube-api-access-gfzqn\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.468474 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2\") " pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.634179 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/notifications-rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.644721 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.646239 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.653578 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.653655 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.653676 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.653698 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.653999 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.654252 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-fnhrw" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.654388 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.667337 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.737725 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-tls\") 
pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.737780 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.737845 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjp76\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-kube-api-access-vjp76\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.737866 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.738001 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-config-data\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.738028 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.738051 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.738068 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9251ac9a-275e-4622-83a2-121d59ec8cd1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.738180 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.738309 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.738370 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9251ac9a-275e-4622-83a2-121d59ec8cd1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 
crc kubenswrapper[4873]: I0219 10:00:13.840426 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-config-data\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.840462 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.840480 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.840497 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9251ac9a-275e-4622-83a2-121d59ec8cd1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.840519 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.840555 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.840578 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9251ac9a-275e-4622-83a2-121d59ec8cd1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.840597 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.840621 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.840650 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjp76\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-kube-api-access-vjp76\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.840669 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " 
pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.841068 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.841843 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-config-data\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.842503 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.842701 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.843456 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.844242 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.847248 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9251ac9a-275e-4622-83a2-121d59ec8cd1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.847904 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.847966 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9251ac9a-275e-4622-83a2-121d59ec8cd1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.849353 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.869157 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 
crc kubenswrapper[4873]: I0219 10:00:13.874630 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjp76\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-kube-api-access-vjp76\") pod \"rabbitmq-server-0\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") " pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.975450 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.978349 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.979526 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.985570 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6k7rl" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.987653 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.987821 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.987945 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.988042 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.988168 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.988280 4873 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 10:00:13 crc kubenswrapper[4873]: I0219 10:00:13.996761 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.145811 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.145853 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/86685946-19ac-434a-974f-99b5beeda172-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.145895 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.145914 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.145933 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.146016 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m275n\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-kube-api-access-m275n\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.146039 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.146131 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.146169 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/86685946-19ac-434a-974f-99b5beeda172-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.146206 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.146326 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.248065 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m275n\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-kube-api-access-m275n\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.248167 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.248231 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.248264 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/86685946-19ac-434a-974f-99b5beeda172-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.248316 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.248347 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.248398 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.248422 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/86685946-19ac-434a-974f-99b5beeda172-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.248466 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.248487 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.248513 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.249362 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.249712 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.249978 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 
10:00:14.250137 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.250278 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.250646 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.253394 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.270169 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/86685946-19ac-434a-974f-99b5beeda172-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.272841 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.272935 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/86685946-19ac-434a-974f-99b5beeda172-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.273521 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.290188 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m275n\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-kube-api-access-m275n\") pod \"rabbitmq-cell1-server-0\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:14 crc kubenswrapper[4873]: I0219 10:00:14.324980 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.431997 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.436609 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.443745 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.446072 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.446178 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.446625 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.446689 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-vvcp2" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.454921 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.568148 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.568230 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q2kz\" (UniqueName: \"kubernetes.io/projected/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-kube-api-access-8q2kz\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.568262 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.568346 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-kolla-config\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.568384 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.568647 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-config-data-default\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.568737 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.568777 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.670061 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q2kz\" (UniqueName: \"kubernetes.io/projected/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-kube-api-access-8q2kz\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.670190 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.670254 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-kolla-config\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.670276 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.670321 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-config-data-default\") pod 
\"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.670343 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.670365 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.670426 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.670878 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.671447 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc 
kubenswrapper[4873]: I0219 10:00:15.671738 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-kolla-config\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.671789 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-config-data-default\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.680547 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.681988 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.682907 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.695500 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.700790 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q2kz\" (UniqueName: \"kubernetes.io/projected/f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964-kube-api-access-8q2kz\") pod \"openstack-galera-0\" (UID: \"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964\") " pod="openstack/openstack-galera-0" Feb 19 10:00:15 crc kubenswrapper[4873]: I0219 10:00:15.758988 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.809501 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.810823 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.813200 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.813289 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.815533 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.821809 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-xfxn6" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.828747 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.888247 4873 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e3385c22-baa0-4261-b498-6a09c8768520-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.888310 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3385c22-baa0-4261-b498-6a09c8768520-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.888343 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3385c22-baa0-4261-b498-6a09c8768520-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.888365 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e3385c22-baa0-4261-b498-6a09c8768520-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.888425 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3385c22-baa0-4261-b498-6a09c8768520-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 
10:00:16.888473 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e3385c22-baa0-4261-b498-6a09c8768520-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.888501 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.888610 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwswv\" (UniqueName: \"kubernetes.io/projected/e3385c22-baa0-4261-b498-6a09c8768520-kube-api-access-qwswv\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.899083 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.900057 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.902016 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.902255 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.903220 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-d42bp" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.916683 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.989929 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e3385c22-baa0-4261-b498-6a09c8768520-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.989975 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21bb5d7d-6565-484a-af2d-0edcff2729b3-kolla-config\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.990017 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.990056 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qwswv\" (UniqueName: \"kubernetes.io/projected/e3385c22-baa0-4261-b498-6a09c8768520-kube-api-access-qwswv\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.990091 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e3385c22-baa0-4261-b498-6a09c8768520-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.990207 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3385c22-baa0-4261-b498-6a09c8768520-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.990242 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21bb5d7d-6565-484a-af2d-0edcff2729b3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.990274 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3385c22-baa0-4261-b498-6a09c8768520-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.990302 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/21bb5d7d-6565-484a-af2d-0edcff2729b3-config-data\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.990275 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.990327 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e3385c22-baa0-4261-b498-6a09c8768520-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.990453 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e3385c22-baa0-4261-b498-6a09c8768520-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.990474 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/21bb5d7d-6565-484a-af2d-0edcff2729b3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.990518 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3385c22-baa0-4261-b498-6a09c8768520-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.990552 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9nlz\" (UniqueName: \"kubernetes.io/projected/21bb5d7d-6565-484a-af2d-0edcff2729b3-kube-api-access-l9nlz\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.991017 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e3385c22-baa0-4261-b498-6a09c8768520-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.991051 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e3385c22-baa0-4261-b498-6a09c8768520-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.991945 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3385c22-baa0-4261-b498-6a09c8768520-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:16 crc kubenswrapper[4873]: I0219 10:00:16.995858 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3385c22-baa0-4261-b498-6a09c8768520-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " 
pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:17 crc kubenswrapper[4873]: I0219 10:00:17.005545 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3385c22-baa0-4261-b498-6a09c8768520-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:17 crc kubenswrapper[4873]: I0219 10:00:17.008841 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwswv\" (UniqueName: \"kubernetes.io/projected/e3385c22-baa0-4261-b498-6a09c8768520-kube-api-access-qwswv\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:17 crc kubenswrapper[4873]: I0219 10:00:17.013217 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e3385c22-baa0-4261-b498-6a09c8768520\") " pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:17 crc kubenswrapper[4873]: I0219 10:00:17.091796 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21bb5d7d-6565-484a-af2d-0edcff2729b3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:17 crc kubenswrapper[4873]: I0219 10:00:17.091849 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21bb5d7d-6565-484a-af2d-0edcff2729b3-config-data\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:17 crc kubenswrapper[4873]: I0219 10:00:17.091874 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/21bb5d7d-6565-484a-af2d-0edcff2729b3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:17 crc kubenswrapper[4873]: I0219 10:00:17.091907 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9nlz\" (UniqueName: \"kubernetes.io/projected/21bb5d7d-6565-484a-af2d-0edcff2729b3-kube-api-access-l9nlz\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:17 crc kubenswrapper[4873]: I0219 10:00:17.091944 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21bb5d7d-6565-484a-af2d-0edcff2729b3-kolla-config\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:17 crc kubenswrapper[4873]: I0219 10:00:17.092716 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21bb5d7d-6565-484a-af2d-0edcff2729b3-config-data\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:17 crc kubenswrapper[4873]: I0219 10:00:17.092759 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21bb5d7d-6565-484a-af2d-0edcff2729b3-kolla-config\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:17 crc kubenswrapper[4873]: I0219 10:00:17.097527 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21bb5d7d-6565-484a-af2d-0edcff2729b3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:17 
crc kubenswrapper[4873]: I0219 10:00:17.098561 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/21bb5d7d-6565-484a-af2d-0edcff2729b3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:17 crc kubenswrapper[4873]: I0219 10:00:17.109234 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9nlz\" (UniqueName: \"kubernetes.io/projected/21bb5d7d-6565-484a-af2d-0edcff2729b3-kube-api-access-l9nlz\") pod \"memcached-0\" (UID: \"21bb5d7d-6565-484a-af2d-0edcff2729b3\") " pod="openstack/memcached-0" Feb 19 10:00:17 crc kubenswrapper[4873]: I0219 10:00:17.129228 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 10:00:17 crc kubenswrapper[4873]: I0219 10:00:17.213519 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 10:00:18 crc kubenswrapper[4873]: I0219 10:00:18.240258 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:00:18 crc kubenswrapper[4873]: I0219 10:00:18.240344 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:00:19 crc kubenswrapper[4873]: I0219 10:00:19.253984 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 10:00:19 crc kubenswrapper[4873]: I0219 
10:00:19.254915 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 10:00:19 crc kubenswrapper[4873]: I0219 10:00:19.258395 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-l2t29" Feb 19 10:00:19 crc kubenswrapper[4873]: I0219 10:00:19.267643 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 10:00:19 crc kubenswrapper[4873]: I0219 10:00:19.337096 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fb6h\" (UniqueName: \"kubernetes.io/projected/5224ec80-b354-467f-b660-2d22b9725be0-kube-api-access-7fb6h\") pod \"kube-state-metrics-0\" (UID: \"5224ec80-b354-467f-b660-2d22b9725be0\") " pod="openstack/kube-state-metrics-0" Feb 19 10:00:19 crc kubenswrapper[4873]: I0219 10:00:19.439415 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fb6h\" (UniqueName: \"kubernetes.io/projected/5224ec80-b354-467f-b660-2d22b9725be0-kube-api-access-7fb6h\") pod \"kube-state-metrics-0\" (UID: \"5224ec80-b354-467f-b660-2d22b9725be0\") " pod="openstack/kube-state-metrics-0" Feb 19 10:00:19 crc kubenswrapper[4873]: I0219 10:00:19.463973 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fb6h\" (UniqueName: \"kubernetes.io/projected/5224ec80-b354-467f-b660-2d22b9725be0-kube-api-access-7fb6h\") pod \"kube-state-metrics-0\" (UID: \"5224ec80-b354-467f-b660-2d22b9725be0\") " pod="openstack/kube-state-metrics-0" Feb 19 10:00:19 crc kubenswrapper[4873]: I0219 10:00:19.570440 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.488906 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.496024 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.504079 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.510809 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.511026 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.522480 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.522630 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.522887 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.522998 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.527740 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.528437 4873 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"metric-storage-prometheus-dockercfg-stpz9" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.562838 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0b46b116-4858-4b6a-b3ad-9337272f9a91-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.563358 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.563436 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz72n\" (UniqueName: \"kubernetes.io/projected/0b46b116-4858-4b6a-b3ad-9337272f9a91-kube-api-access-hz72n\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.563477 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.563697 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.563781 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-config\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.563824 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.563967 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.564163 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.564233 4873 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0b46b116-4858-4b6a-b3ad-9337272f9a91-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.666070 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.666163 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.666192 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-config\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.666217 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 
crc kubenswrapper[4873]: I0219 10:00:20.666255 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.666287 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.666309 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0b46b116-4858-4b6a-b3ad-9337272f9a91-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.666358 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0b46b116-4858-4b6a-b3ad-9337272f9a91-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.666381 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 
crc kubenswrapper[4873]: I0219 10:00:20.666421 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz72n\" (UniqueName: \"kubernetes.io/projected/0b46b116-4858-4b6a-b3ad-9337272f9a91-kube-api-access-hz72n\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.667745 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.669203 4873 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.669251 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/668a04d4437b4137f130ddea3fc0a68c22db655664b336b39ceb124bf62a44ab/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.669465 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.669548 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.672771 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0b46b116-4858-4b6a-b3ad-9337272f9a91-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.673338 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" 
(UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.674684 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.679252 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-config\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.683287 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz72n\" (UniqueName: \"kubernetes.io/projected/0b46b116-4858-4b6a-b3ad-9337272f9a91-kube-api-access-hz72n\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.704758 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.708982 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/0b46b116-4858-4b6a-b3ad-9337272f9a91-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.863367 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 10:00:20 crc kubenswrapper[4873]: I0219 10:00:20.912076 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58ff7f48c5-nqbz4"] Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.441950 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vsnt5"] Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.443442 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.445891 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.446786 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-l2dvx" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.446964 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.457272 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vsnt5"] Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.478124 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-t5bgp"] Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.480325 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.485765 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-t5bgp"] Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.494441 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6v9q\" (UniqueName: \"kubernetes.io/projected/b0ab9d21-0c11-4940-ad43-3e20c46012ad-kube-api-access-f6v9q\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.494497 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0ab9d21-0c11-4940-ad43-3e20c46012ad-scripts\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.494516 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ab9d21-0c11-4940-ad43-3e20c46012ad-ovn-controller-tls-certs\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.494538 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ab9d21-0c11-4940-ad43-3e20c46012ad-combined-ca-bundle\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.494556 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/b0ab9d21-0c11-4940-ad43-3e20c46012ad-var-run\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.494573 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0ab9d21-0c11-4940-ad43-3e20c46012ad-var-log-ovn\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.494603 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0ab9d21-0c11-4940-ad43-3e20c46012ad-var-run-ovn\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.596503 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6v9q\" (UniqueName: \"kubernetes.io/projected/b0ab9d21-0c11-4940-ad43-3e20c46012ad-kube-api-access-f6v9q\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.596566 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de2f2331-fc83-420b-9e1b-fe08998cb0ab-scripts\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.596608 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0ab9d21-0c11-4940-ad43-3e20c46012ad-scripts\") pod 
\"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.596681 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ab9d21-0c11-4940-ad43-3e20c46012ad-ovn-controller-tls-certs\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.596709 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ab9d21-0c11-4940-ad43-3e20c46012ad-combined-ca-bundle\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.597125 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0ab9d21-0c11-4940-ad43-3e20c46012ad-var-run\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.597161 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0ab9d21-0c11-4940-ad43-3e20c46012ad-var-log-ovn\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.597186 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de2f2331-fc83-420b-9e1b-fe08998cb0ab-var-run\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 
10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.597204 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6kzd\" (UniqueName: \"kubernetes.io/projected/de2f2331-fc83-420b-9e1b-fe08998cb0ab-kube-api-access-w6kzd\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.597229 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0ab9d21-0c11-4940-ad43-3e20c46012ad-var-run-ovn\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.598429 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0ab9d21-0c11-4940-ad43-3e20c46012ad-var-log-ovn\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.599617 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0ab9d21-0c11-4940-ad43-3e20c46012ad-scripts\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.600027 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0ab9d21-0c11-4940-ad43-3e20c46012ad-var-run-ovn\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.600064 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/de2f2331-fc83-420b-9e1b-fe08998cb0ab-etc-ovs\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.600381 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/de2f2331-fc83-420b-9e1b-fe08998cb0ab-var-log\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.600535 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/de2f2331-fc83-420b-9e1b-fe08998cb0ab-var-lib\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.600212 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0ab9d21-0c11-4940-ad43-3e20c46012ad-var-run\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.604672 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ab9d21-0c11-4940-ad43-3e20c46012ad-combined-ca-bundle\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.608926 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ab9d21-0c11-4940-ad43-3e20c46012ad-ovn-controller-tls-certs\") pod 
\"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.623406 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6v9q\" (UniqueName: \"kubernetes.io/projected/b0ab9d21-0c11-4940-ad43-3e20c46012ad-kube-api-access-f6v9q\") pod \"ovn-controller-vsnt5\" (UID: \"b0ab9d21-0c11-4940-ad43-3e20c46012ad\") " pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.701783 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de2f2331-fc83-420b-9e1b-fe08998cb0ab-var-run\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.701852 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6kzd\" (UniqueName: \"kubernetes.io/projected/de2f2331-fc83-420b-9e1b-fe08998cb0ab-kube-api-access-w6kzd\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.701896 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/de2f2331-fc83-420b-9e1b-fe08998cb0ab-etc-ovs\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.702155 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de2f2331-fc83-420b-9e1b-fe08998cb0ab-var-run\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc 
kubenswrapper[4873]: I0219 10:00:22.702443 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/de2f2331-fc83-420b-9e1b-fe08998cb0ab-var-log\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.702518 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/de2f2331-fc83-420b-9e1b-fe08998cb0ab-var-lib\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.702564 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de2f2331-fc83-420b-9e1b-fe08998cb0ab-scripts\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.703733 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/de2f2331-fc83-420b-9e1b-fe08998cb0ab-var-log\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.703933 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/de2f2331-fc83-420b-9e1b-fe08998cb0ab-var-lib\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.704080 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/de2f2331-fc83-420b-9e1b-fe08998cb0ab-etc-ovs\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.711278 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de2f2331-fc83-420b-9e1b-fe08998cb0ab-scripts\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.716548 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6kzd\" (UniqueName: \"kubernetes.io/projected/de2f2331-fc83-420b-9e1b-fe08998cb0ab-kube-api-access-w6kzd\") pod \"ovn-controller-ovs-t5bgp\" (UID: \"de2f2331-fc83-420b-9e1b-fe08998cb0ab\") " pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.768531 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:22 crc kubenswrapper[4873]: I0219 10:00:22.796521 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:24 crc kubenswrapper[4873]: I0219 10:00:24.318123 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xw7xl"] Feb 19 10:00:24 crc kubenswrapper[4873]: I0219 10:00:24.326439 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:24 crc kubenswrapper[4873]: I0219 10:00:24.388166 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xw7xl"] Feb 19 10:00:24 crc kubenswrapper[4873]: I0219 10:00:24.435980 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8568b0bc-e3d1-4e4e-8172-bada186b750a-catalog-content\") pod \"redhat-marketplace-xw7xl\" (UID: \"8568b0bc-e3d1-4e4e-8172-bada186b750a\") " pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:24 crc kubenswrapper[4873]: I0219 10:00:24.436083 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8568b0bc-e3d1-4e4e-8172-bada186b750a-utilities\") pod \"redhat-marketplace-xw7xl\" (UID: \"8568b0bc-e3d1-4e4e-8172-bada186b750a\") " pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:24 crc kubenswrapper[4873]: I0219 10:00:24.436257 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84czc\" (UniqueName: \"kubernetes.io/projected/8568b0bc-e3d1-4e4e-8172-bada186b750a-kube-api-access-84czc\") pod \"redhat-marketplace-xw7xl\" (UID: \"8568b0bc-e3d1-4e4e-8172-bada186b750a\") " pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:24 crc kubenswrapper[4873]: I0219 10:00:24.537221 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84czc\" (UniqueName: \"kubernetes.io/projected/8568b0bc-e3d1-4e4e-8172-bada186b750a-kube-api-access-84czc\") pod \"redhat-marketplace-xw7xl\" (UID: \"8568b0bc-e3d1-4e4e-8172-bada186b750a\") " pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:24 crc kubenswrapper[4873]: I0219 10:00:24.537299 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8568b0bc-e3d1-4e4e-8172-bada186b750a-catalog-content\") pod \"redhat-marketplace-xw7xl\" (UID: \"8568b0bc-e3d1-4e4e-8172-bada186b750a\") " pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:24 crc kubenswrapper[4873]: I0219 10:00:24.537349 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8568b0bc-e3d1-4e4e-8172-bada186b750a-utilities\") pod \"redhat-marketplace-xw7xl\" (UID: \"8568b0bc-e3d1-4e4e-8172-bada186b750a\") " pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:24 crc kubenswrapper[4873]: I0219 10:00:24.537945 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8568b0bc-e3d1-4e4e-8172-bada186b750a-utilities\") pod \"redhat-marketplace-xw7xl\" (UID: \"8568b0bc-e3d1-4e4e-8172-bada186b750a\") " pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:24 crc kubenswrapper[4873]: I0219 10:00:24.538123 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8568b0bc-e3d1-4e4e-8172-bada186b750a-catalog-content\") pod \"redhat-marketplace-xw7xl\" (UID: \"8568b0bc-e3d1-4e4e-8172-bada186b750a\") " pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:24 crc kubenswrapper[4873]: I0219 10:00:24.567366 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84czc\" (UniqueName: \"kubernetes.io/projected/8568b0bc-e3d1-4e4e-8172-bada186b750a-kube-api-access-84czc\") pod \"redhat-marketplace-xw7xl\" (UID: \"8568b0bc-e3d1-4e4e-8172-bada186b750a\") " pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:24 crc kubenswrapper[4873]: I0219 10:00:24.688860 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:26 crc kubenswrapper[4873]: W0219 10:00:26.021496 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd736e93a_6a36_458e_a8f4_a9d511530043.slice/crio-9fb66c25e5ae993349fe01e14d2ddec63fb2fa9c9f8fdca8f66724a42a087dcf WatchSource:0}: Error finding container 9fb66c25e5ae993349fe01e14d2ddec63fb2fa9c9f8fdca8f66724a42a087dcf: Status 404 returned error can't find the container with id 9fb66c25e5ae993349fe01e14d2ddec63fb2fa9c9f8fdca8f66724a42a087dcf Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.278802 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" event={"ID":"d736e93a-6a36-458e-a8f4-a9d511530043","Type":"ContainerStarted","Data":"9fb66c25e5ae993349fe01e14d2ddec63fb2fa9c9f8fdca8f66724a42a087dcf"} Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.316713 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.319785 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.324368 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.324449 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.324556 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.324704 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.324862 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-8f5jd" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.332082 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.369145 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4574f6e3-d697-424c-a9f1-7b74afb82324-config\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.369481 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4574f6e3-d697-424c-a9f1-7b74afb82324-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.369548 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-z7qmv\" (UniqueName: \"kubernetes.io/projected/4574f6e3-d697-424c-a9f1-7b74afb82324-kube-api-access-z7qmv\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.369719 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4574f6e3-d697-424c-a9f1-7b74afb82324-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.369768 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4574f6e3-d697-424c-a9f1-7b74afb82324-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.369951 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4574f6e3-d697-424c-a9f1-7b74afb82324-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.370071 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4574f6e3-d697-424c-a9f1-7b74afb82324-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.370234 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.467354 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.473032 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7qmv\" (UniqueName: \"kubernetes.io/projected/4574f6e3-d697-424c-a9f1-7b74afb82324-kube-api-access-z7qmv\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.473088 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4574f6e3-d697-424c-a9f1-7b74afb82324-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.473122 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4574f6e3-d697-424c-a9f1-7b74afb82324-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.473161 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4574f6e3-d697-424c-a9f1-7b74afb82324-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.473199 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/4574f6e3-d697-424c-a9f1-7b74afb82324-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.473220 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.473245 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4574f6e3-d697-424c-a9f1-7b74afb82324-config\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.473265 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4574f6e3-d697-424c-a9f1-7b74afb82324-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.473903 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4574f6e3-d697-424c-a9f1-7b74afb82324-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.474507 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4574f6e3-d697-424c-a9f1-7b74afb82324-config\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 
10:00:26.474684 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4574f6e3-d697-424c-a9f1-7b74afb82324-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.474945 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.486765 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4574f6e3-d697-424c-a9f1-7b74afb82324-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.488367 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4574f6e3-d697-424c-a9f1-7b74afb82324-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.489176 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4574f6e3-d697-424c-a9f1-7b74afb82324-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.492475 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7qmv\" (UniqueName: 
\"kubernetes.io/projected/4574f6e3-d697-424c-a9f1-7b74afb82324-kube-api-access-z7qmv\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.509556 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4574f6e3-d697-424c-a9f1-7b74afb82324\") " pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.516417 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7569d6d65f-54dks"] Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.535929 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.537565 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.540579 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.540743 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.540915 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.541184 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-nbv9w" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.566882 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.574609 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/877efa5f-4357-4396-8805-729237cd4e8f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.574724 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.574941 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877efa5f-4357-4396-8805-729237cd4e8f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.575075 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/877efa5f-4357-4396-8805-729237cd4e8f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.575217 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/877efa5f-4357-4396-8805-729237cd4e8f-config\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.575344 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/877efa5f-4357-4396-8805-729237cd4e8f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.575447 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd7l8\" (UniqueName: \"kubernetes.io/projected/877efa5f-4357-4396-8805-729237cd4e8f-kube-api-access-dd7l8\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.575553 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/877efa5f-4357-4396-8805-729237cd4e8f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.647465 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.677651 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/877efa5f-4357-4396-8805-729237cd4e8f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.677705 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.677743 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877efa5f-4357-4396-8805-729237cd4e8f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.677785 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/877efa5f-4357-4396-8805-729237cd4e8f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.677825 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/877efa5f-4357-4396-8805-729237cd4e8f-config\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.677863 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/877efa5f-4357-4396-8805-729237cd4e8f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.677895 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd7l8\" (UniqueName: \"kubernetes.io/projected/877efa5f-4357-4396-8805-729237cd4e8f-kube-api-access-dd7l8\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.677941 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/877efa5f-4357-4396-8805-729237cd4e8f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.678839 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/877efa5f-4357-4396-8805-729237cd4e8f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.678955 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.680344 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/877efa5f-4357-4396-8805-729237cd4e8f-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.680765 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/877efa5f-4357-4396-8805-729237cd4e8f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.683284 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/877efa5f-4357-4396-8805-729237cd4e8f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.683768 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/877efa5f-4357-4396-8805-729237cd4e8f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.685022 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877efa5f-4357-4396-8805-729237cd4e8f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.695071 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd7l8\" (UniqueName: \"kubernetes.io/projected/877efa5f-4357-4396-8805-729237cd4e8f-kube-api-access-dd7l8\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.702386 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"877efa5f-4357-4396-8805-729237cd4e8f\") " pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:26 crc kubenswrapper[4873]: I0219 10:00:26.878384 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:27 crc kubenswrapper[4873]: E0219 10:00:27.072495 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 10:00:27 crc kubenswrapper[4873]: E0219 10:00:27.072753 4873 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 10:00:27 crc kubenswrapper[4873]: E0219 10:00:27.073459 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.20:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r4ddx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7bb9bf987-bjckx_openstack(bea1b96b-f9da-4733-a537-a536ec66edc0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 10:00:27 crc kubenswrapper[4873]: E0219 10:00:27.075029 4873 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" podUID="bea1b96b-f9da-4733-a537-a536ec66edc0" Feb 19 10:00:27 crc kubenswrapper[4873]: E0219 10:00:27.218802 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 10:00:27 crc kubenswrapper[4873]: E0219 10:00:27.219050 4873 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Feb 19 10:00:27 crc kubenswrapper[4873]: E0219 10:00:27.219165 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.20:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nnswl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57b9d58665-gfr42_openstack(b3a77d5e-b932-466f-a391-983ffef7c5ae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 10:00:27 crc kubenswrapper[4873]: E0219 10:00:27.220445 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-57b9d58665-gfr42" podUID="b3a77d5e-b932-466f-a391-983ffef7c5ae" Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.297516 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9251ac9a-275e-4622-83a2-121d59ec8cd1","Type":"ContainerStarted","Data":"7661fe6352a716a9db14456953448866e2c9797ab10f540b398fdf6a05d1c0b7"} Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.299003 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7569d6d65f-54dks" event={"ID":"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7","Type":"ContainerStarted","Data":"f1e538b93b7f75469d9218fb31bc488292aeea03fa4ada6d0bc787cf733da55f"} Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.524023 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.562413 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.571881 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-684f645dc-zkgql"] Feb 19 10:00:27 crc kubenswrapper[4873]: W0219 10:00:27.587932 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1b4e4e4_15bf_4c4d_b7c4_bc3029c32964.slice/crio-1acf8a8ce5d5cd8234f816a3c72ac7a8779e66461a9c47b88ba6efbf92c3914d WatchSource:0}: Error finding container 1acf8a8ce5d5cd8234f816a3c72ac7a8779e66461a9c47b88ba6efbf92c3914d: Status 404 returned error can't find the container with id 1acf8a8ce5d5cd8234f816a3c72ac7a8779e66461a9c47b88ba6efbf92c3914d Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.719397 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.804359 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bea1b96b-f9da-4733-a537-a536ec66edc0-dns-svc\") pod \"bea1b96b-f9da-4733-a537-a536ec66edc0\" (UID: \"bea1b96b-f9da-4733-a537-a536ec66edc0\") " Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.804476 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea1b96b-f9da-4733-a537-a536ec66edc0-config\") pod \"bea1b96b-f9da-4733-a537-a536ec66edc0\" (UID: \"bea1b96b-f9da-4733-a537-a536ec66edc0\") " Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.804511 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4ddx\" (UniqueName: \"kubernetes.io/projected/bea1b96b-f9da-4733-a537-a536ec66edc0-kube-api-access-r4ddx\") pod \"bea1b96b-f9da-4733-a537-a536ec66edc0\" (UID: \"bea1b96b-f9da-4733-a537-a536ec66edc0\") " Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.805151 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bea1b96b-f9da-4733-a537-a536ec66edc0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bea1b96b-f9da-4733-a537-a536ec66edc0" (UID: "bea1b96b-f9da-4733-a537-a536ec66edc0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.805630 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bea1b96b-f9da-4733-a537-a536ec66edc0-config" (OuterVolumeSpecName: "config") pod "bea1b96b-f9da-4733-a537-a536ec66edc0" (UID: "bea1b96b-f9da-4733-a537-a536ec66edc0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.808887 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bea1b96b-f9da-4733-a537-a536ec66edc0-kube-api-access-r4ddx" (OuterVolumeSpecName: "kube-api-access-r4ddx") pod "bea1b96b-f9da-4733-a537-a536ec66edc0" (UID: "bea1b96b-f9da-4733-a537-a536ec66edc0"). InnerVolumeSpecName "kube-api-access-r4ddx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.910552 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bea1b96b-f9da-4733-a537-a536ec66edc0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.910902 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea1b96b-f9da-4733-a537-a536ec66edc0-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.910917 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4ddx\" (UniqueName: \"kubernetes.io/projected/bea1b96b-f9da-4733-a537-a536ec66edc0-kube-api-access-r4ddx\") on node \"crc\" DevicePath \"\"" Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.922517 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 10:00:27 crc kubenswrapper[4873]: I0219 10:00:27.961841 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.101842 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vsnt5"] Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.106933 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xw7xl"] Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 
10:00:28.128928 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.134608 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 10:00:28 crc kubenswrapper[4873]: W0219 10:00:28.145500 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0ab9d21_0c11_4940_ad43_3e20c46012ad.slice/crio-251b22c5a16ace43b380a7b8daf6ec436f78cf1385f99b88a681a1a553b6aad3 WatchSource:0}: Error finding container 251b22c5a16ace43b380a7b8daf6ec436f78cf1385f99b88a681a1a553b6aad3: Status 404 returned error can't find the container with id 251b22c5a16ace43b380a7b8daf6ec436f78cf1385f99b88a681a1a553b6aad3 Feb 19 10:00:28 crc kubenswrapper[4873]: W0219 10:00:28.145907 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8568b0bc_e3d1_4e4e_8172_bada186b750a.slice/crio-a80672d41aa2943f5cef88b806f5ebd8b5afa23daa0fded90e827fd2b0faf463 WatchSource:0}: Error finding container a80672d41aa2943f5cef88b806f5ebd8b5afa23daa0fded90e827fd2b0faf463: Status 404 returned error can't find the container with id a80672d41aa2943f5cef88b806f5ebd8b5afa23daa0fded90e827fd2b0faf463 Feb 19 10:00:28 crc kubenswrapper[4873]: W0219 10:00:28.147705 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3385c22_baa0_4261_b498_6a09c8768520.slice/crio-f649ecf344cb5c5d5b12ff4d9ba9127518e13b208a1889cf50a73a268363958b WatchSource:0}: Error finding container f649ecf344cb5c5d5b12ff4d9ba9127518e13b208a1889cf50a73a268363958b: Status 404 returned error can't find the container with id f649ecf344cb5c5d5b12ff4d9ba9127518e13b208a1889cf50a73a268363958b Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.185812 4873 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b9d58665-gfr42"
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.214518 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a77d5e-b932-466f-a391-983ffef7c5ae-config\") pod \"b3a77d5e-b932-466f-a391-983ffef7c5ae\" (UID: \"b3a77d5e-b932-466f-a391-983ffef7c5ae\") "
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.214683 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnswl\" (UniqueName: \"kubernetes.io/projected/b3a77d5e-b932-466f-a391-983ffef7c5ae-kube-api-access-nnswl\") pod \"b3a77d5e-b932-466f-a391-983ffef7c5ae\" (UID: \"b3a77d5e-b932-466f-a391-983ffef7c5ae\") "
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.215000 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3a77d5e-b932-466f-a391-983ffef7c5ae-config" (OuterVolumeSpecName: "config") pod "b3a77d5e-b932-466f-a391-983ffef7c5ae" (UID: "b3a77d5e-b932-466f-a391-983ffef7c5ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.220506 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3a77d5e-b932-466f-a391-983ffef7c5ae-kube-api-access-nnswl" (OuterVolumeSpecName: "kube-api-access-nnswl") pod "b3a77d5e-b932-466f-a391-983ffef7c5ae" (UID: "b3a77d5e-b932-466f-a391-983ffef7c5ae"). InnerVolumeSpecName "kube-api-access-nnswl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.306917 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"21bb5d7d-6565-484a-af2d-0edcff2729b3","Type":"ContainerStarted","Data":"5f84bcbc933f4db8f25396e941d733635449ed485576a8055e2cb166754301d7"}
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.312175 4873 generic.go:334] "Generic (PLEG): container finished" podID="16ae739e-2542-4b44-820b-e08570c825dc" containerID="fd5ed21757630d8854c09406757934d793b6ff121a9eac4b399434519950fdf8" exitCode=0
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.312263 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684f645dc-zkgql" event={"ID":"16ae739e-2542-4b44-820b-e08570c825dc","Type":"ContainerDied","Data":"fd5ed21757630d8854c09406757934d793b6ff121a9eac4b399434519950fdf8"}
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.312281 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684f645dc-zkgql" event={"ID":"16ae739e-2542-4b44-820b-e08570c825dc","Type":"ContainerStarted","Data":"4d2d1d654b551f3e3bb135ca0685d9199fac6f3f5d189b5a1555a413a88b5ddc"}
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.315837 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnswl\" (UniqueName: \"kubernetes.io/projected/b3a77d5e-b932-466f-a391-983ffef7c5ae-kube-api-access-nnswl\") on node \"crc\" DevicePath \"\""
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.315859 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a77d5e-b932-466f-a391-983ffef7c5ae-config\") on node \"crc\" DevicePath \"\""
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.315959 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b9d58665-gfr42"
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.315950 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b9d58665-gfr42" event={"ID":"b3a77d5e-b932-466f-a391-983ffef7c5ae","Type":"ContainerDied","Data":"909aed776126287df1a7798864d3d0881f670c7df611ac6ef1496c9f130ee423"}
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.328003 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2","Type":"ContainerStarted","Data":"309ae73cd35faf85f8404bbe1f172f13da31fd15960c65b167f5d2c3610e2a86"}
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.331449 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb9bf987-bjckx" event={"ID":"bea1b96b-f9da-4733-a537-a536ec66edc0","Type":"ContainerDied","Data":"68a3e971085bdee9176e4f2a37706001ac7c9cd7629296c14068391c6eb2512d"}
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.331547 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bb9bf987-bjckx"
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.337272 4873 generic.go:334] "Generic (PLEG): container finished" podID="040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7" containerID="505d554bc6a88454d3df3439ef0d84b488679b7ec3847d5f4302a334f4220e6d" exitCode=0
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.337343 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7569d6d65f-54dks" event={"ID":"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7","Type":"ContainerDied","Data":"505d554bc6a88454d3df3439ef0d84b488679b7ec3847d5f4302a334f4220e6d"}
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.347873 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.350234 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xw7xl" event={"ID":"8568b0bc-e3d1-4e4e-8172-bada186b750a","Type":"ContainerStarted","Data":"a80672d41aa2943f5cef88b806f5ebd8b5afa23daa0fded90e827fd2b0faf463"}
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.382332 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"86685946-19ac-434a-974f-99b5beeda172","Type":"ContainerStarted","Data":"ef991a861997941a147c9b5a0da440f69f41ed8b1c1a849520b30accb3784df6"}
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.409307 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vsnt5" event={"ID":"b0ab9d21-0c11-4940-ad43-3e20c46012ad","Type":"ContainerStarted","Data":"251b22c5a16ace43b380a7b8daf6ec436f78cf1385f99b88a681a1a553b6aad3"}
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.419950 4873 generic.go:334] "Generic (PLEG): container finished" podID="d736e93a-6a36-458e-a8f4-a9d511530043" containerID="11d98b54d3d242cf39469162eef86399c7ac915ea837018032d2240dd76c8ed3" exitCode=0
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.420012 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" event={"ID":"d736e93a-6a36-458e-a8f4-a9d511530043","Type":"ContainerDied","Data":"11d98b54d3d242cf39469162eef86399c7ac915ea837018032d2240dd76c8ed3"}
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.422505 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5224ec80-b354-467f-b660-2d22b9725be0","Type":"ContainerStarted","Data":"e0fab87f6d902a58d41b4b35cef6645c9197dee8f59fc04defe1aac4065e472b"}
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.423920 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e3385c22-baa0-4261-b498-6a09c8768520","Type":"ContainerStarted","Data":"f649ecf344cb5c5d5b12ff4d9ba9127518e13b208a1889cf50a73a268363958b"}
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.424678 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964","Type":"ContainerStarted","Data":"1acf8a8ce5d5cd8234f816a3c72ac7a8779e66461a9c47b88ba6efbf92c3914d"}
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.501035 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b9d58665-gfr42"]
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.514416 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57b9d58665-gfr42"]
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.530363 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bb9bf987-bjckx"]
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.537034 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bb9bf987-bjckx"]
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.549989 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 19 10:00:28 crc kubenswrapper[4873]: E0219 10:00:28.639598 4873 log.go:32] "CreateContainer in sandbox from runtime service failed" err=<
Feb 19 10:00:28 crc kubenswrapper[4873]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Feb 19 10:00:28 crc kubenswrapper[4873]: > podSandboxID="f1e538b93b7f75469d9218fb31bc488292aeea03fa4ada6d0bc787cf733da55f"
Feb 19 10:00:28 crc kubenswrapper[4873]: E0219 10:00:28.639772 4873 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 19 10:00:28 crc kubenswrapper[4873]: container &Container{Name:dnsmasq-dns,Image:38.102.83.20:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n684h65fh56h6fh87h85h57h76h5b7h94hffh649hfbh8ch5bch56fh5c5hbh86hf9h99h5dch95h66hd5h555h566h646h546h79h9dh55dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6pcf5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7569d6d65f-54dks_openstack(040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Feb 19 10:00:28 crc kubenswrapper[4873]: > logger="UnhandledError"
Feb 19 10:00:28 crc kubenswrapper[4873]: E0219 10:00:28.640957 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-7569d6d65f-54dks" podUID="040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7"
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.757614 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-684f645dc-zkgql"
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.833017 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ae739e-2542-4b44-820b-e08570c825dc-config\") pod \"16ae739e-2542-4b44-820b-e08570c825dc\" (UID: \"16ae739e-2542-4b44-820b-e08570c825dc\") "
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.833254 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16ae739e-2542-4b44-820b-e08570c825dc-dns-svc\") pod \"16ae739e-2542-4b44-820b-e08570c825dc\" (UID: \"16ae739e-2542-4b44-820b-e08570c825dc\") "
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.833308 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlb64\" (UniqueName: \"kubernetes.io/projected/16ae739e-2542-4b44-820b-e08570c825dc-kube-api-access-jlb64\") pod \"16ae739e-2542-4b44-820b-e08570c825dc\" (UID: \"16ae739e-2542-4b44-820b-e08570c825dc\") "
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.839395 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ae739e-2542-4b44-820b-e08570c825dc-kube-api-access-jlb64" (OuterVolumeSpecName: "kube-api-access-jlb64") pod "16ae739e-2542-4b44-820b-e08570c825dc" (UID: "16ae739e-2542-4b44-820b-e08570c825dc"). InnerVolumeSpecName "kube-api-access-jlb64". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.855978 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16ae739e-2542-4b44-820b-e08570c825dc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "16ae739e-2542-4b44-820b-e08570c825dc" (UID: "16ae739e-2542-4b44-820b-e08570c825dc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.857563 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16ae739e-2542-4b44-820b-e08570c825dc-config" (OuterVolumeSpecName: "config") pod "16ae739e-2542-4b44-820b-e08570c825dc" (UID: "16ae739e-2542-4b44-820b-e08570c825dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.935204 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16ae739e-2542-4b44-820b-e08570c825dc-config\") on node \"crc\" DevicePath \"\""
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.935244 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16ae739e-2542-4b44-820b-e08570c825dc-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 10:00:28 crc kubenswrapper[4873]: I0219 10:00:28.935257 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlb64\" (UniqueName: \"kubernetes.io/projected/16ae739e-2542-4b44-820b-e08570c825dc-kube-api-access-jlb64\") on node \"crc\" DevicePath \"\""
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.434202 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.439085 4873 generic.go:334] "Generic (PLEG): container finished" podID="8568b0bc-e3d1-4e4e-8172-bada186b750a" containerID="dcfd39b1c5289cd4a8556f216e399630fd4927789a8b353b5566c5719ea3fcee" exitCode=0
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.439131 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xw7xl" event={"ID":"8568b0bc-e3d1-4e4e-8172-bada186b750a","Type":"ContainerDied","Data":"dcfd39b1c5289cd4a8556f216e399630fd4927789a8b353b5566c5719ea3fcee"}
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.440852 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684f645dc-zkgql" event={"ID":"16ae739e-2542-4b44-820b-e08570c825dc","Type":"ContainerDied","Data":"4d2d1d654b551f3e3bb135ca0685d9199fac6f3f5d189b5a1555a413a88b5ddc"}
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.440879 4873 scope.go:117] "RemoveContainer" containerID="fd5ed21757630d8854c09406757934d793b6ff121a9eac4b399434519950fdf8"
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.440906 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-684f645dc-zkgql"
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.445559 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" event={"ID":"d736e93a-6a36-458e-a8f4-a9d511530043","Type":"ContainerStarted","Data":"7de987cb52aa2fd800cbb7f97c53c2a6500299ae865b3cb3f8039ab8618c7750"}
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.445683 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4"
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.446935 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"877efa5f-4357-4396-8805-729237cd4e8f","Type":"ContainerStarted","Data":"5b8dfcc8863027970b2baffa1cde2eca756b71ec863a27bcadcbef578596d8c4"}
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.448888 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0b46b116-4858-4b6a-b3ad-9337272f9a91","Type":"ContainerStarted","Data":"3d0b7f98084ff77ff34a64d3b9fb32fc7993ea571d51b6cb0b24962f0fd5c9ef"}
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.477451 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" podStartSLOduration=15.759529305000001 podStartE2EDuration="17.477434931s" podCreationTimestamp="2026-02-19 10:00:12 +0000 UTC" firstStartedPulling="2026-02-19 10:00:26.032090755 +0000 UTC m=+935.321522403" lastFinishedPulling="2026-02-19 10:00:27.749996401 +0000 UTC m=+937.039428029" observedRunningTime="2026-02-19 10:00:29.473968307 +0000 UTC m=+938.763399955" watchObservedRunningTime="2026-02-19 10:00:29.477434931 +0000 UTC m=+938.766866569"
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.495406 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3a77d5e-b932-466f-a391-983ffef7c5ae" path="/var/lib/kubelet/pods/b3a77d5e-b932-466f-a391-983ffef7c5ae/volumes"
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.495861 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bea1b96b-f9da-4733-a537-a536ec66edc0" path="/var/lib/kubelet/pods/bea1b96b-f9da-4733-a537-a536ec66edc0/volumes"
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.527150 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-684f645dc-zkgql"]
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.534485 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-684f645dc-zkgql"]
Feb 19 10:00:29 crc kubenswrapper[4873]: I0219 10:00:29.583372 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-t5bgp"]
Feb 19 10:00:30 crc kubenswrapper[4873]: I0219 10:00:30.454868 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4574f6e3-d697-424c-a9f1-7b74afb82324","Type":"ContainerStarted","Data":"1e894480148b571b97ed6e0b1b55886de852c96f7c290e0516e153d29b3da37a"}
Feb 19 10:00:30 crc kubenswrapper[4873]: I0219 10:00:30.455895 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t5bgp" event={"ID":"de2f2331-fc83-420b-9e1b-fe08998cb0ab","Type":"ContainerStarted","Data":"25bf411f87949538bc230bd0926ab6bb33b5472b677c8f8ea482f04a7149e1b4"}
Feb 19 10:00:31 crc kubenswrapper[4873]: I0219 10:00:31.498889 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ae739e-2542-4b44-820b-e08570c825dc" path="/var/lib/kubelet/pods/16ae739e-2542-4b44-820b-e08570c825dc/volumes"
Feb 19 10:00:33 crc kubenswrapper[4873]: I0219 10:00:33.177349 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4"
Feb 19 10:00:33 crc kubenswrapper[4873]: I0219 10:00:33.227759 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7569d6d65f-54dks"]
Feb 19 10:00:36 crc kubenswrapper[4873]: I0219 10:00:36.516807 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7569d6d65f-54dks" event={"ID":"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7","Type":"ContainerStarted","Data":"097bfd5b65d03aa6a20da3832add63d24bee8dabfbbd70df5da33cb07c8feac3"}
Feb 19 10:00:36 crc kubenswrapper[4873]: I0219 10:00:36.517532 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7569d6d65f-54dks"
Feb 19 10:00:36 crc kubenswrapper[4873]: I0219 10:00:36.517294 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7569d6d65f-54dks" podUID="040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7" containerName="dnsmasq-dns" containerID="cri-o://097bfd5b65d03aa6a20da3832add63d24bee8dabfbbd70df5da33cb07c8feac3" gracePeriod=10
Feb 19 10:00:36 crc kubenswrapper[4873]: I0219 10:00:36.539088 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7569d6d65f-54dks" podStartSLOduration=23.861208922 podStartE2EDuration="24.539041486s" podCreationTimestamp="2026-02-19 10:00:12 +0000 UTC" firstStartedPulling="2026-02-19 10:00:27.073299335 +0000 UTC m=+936.362730973" lastFinishedPulling="2026-02-19 10:00:27.751131899 +0000 UTC m=+937.040563537" observedRunningTime="2026-02-19 10:00:36.535673053 +0000 UTC m=+945.825104761" watchObservedRunningTime="2026-02-19 10:00:36.539041486 +0000 UTC m=+945.828473134"
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.527975 4873 generic.go:334] "Generic (PLEG): container finished" podID="8568b0bc-e3d1-4e4e-8172-bada186b750a" containerID="8997664180cc4a89bc96536538fcab96c359ac683add2f689bc03e6ec23ed7ec" exitCode=0
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.528243 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xw7xl" event={"ID":"8568b0bc-e3d1-4e4e-8172-bada186b750a","Type":"ContainerDied","Data":"8997664180cc4a89bc96536538fcab96c359ac683add2f689bc03e6ec23ed7ec"}
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.530579 4873 generic.go:334] "Generic (PLEG): container finished" podID="040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7" containerID="097bfd5b65d03aa6a20da3832add63d24bee8dabfbbd70df5da33cb07c8feac3" exitCode=0
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.530615 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7569d6d65f-54dks" event={"ID":"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7","Type":"ContainerDied","Data":"097bfd5b65d03aa6a20da3832add63d24bee8dabfbbd70df5da33cb07c8feac3"}
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.555338 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lmkgp"]
Feb 19 10:00:37 crc kubenswrapper[4873]: E0219 10:00:37.555803 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ae739e-2542-4b44-820b-e08570c825dc" containerName="init"
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.555830 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ae739e-2542-4b44-820b-e08570c825dc" containerName="init"
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.556033 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ae739e-2542-4b44-820b-e08570c825dc" containerName="init"
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.557806 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.582929 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmkgp"]
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.611423 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl4z6\" (UniqueName: \"kubernetes.io/projected/d115a791-c703-4c6e-91e5-8f3ab9608277-kube-api-access-pl4z6\") pod \"redhat-operators-lmkgp\" (UID: \"d115a791-c703-4c6e-91e5-8f3ab9608277\") " pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.611683 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d115a791-c703-4c6e-91e5-8f3ab9608277-catalog-content\") pod \"redhat-operators-lmkgp\" (UID: \"d115a791-c703-4c6e-91e5-8f3ab9608277\") " pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.611872 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d115a791-c703-4c6e-91e5-8f3ab9608277-utilities\") pod \"redhat-operators-lmkgp\" (UID: \"d115a791-c703-4c6e-91e5-8f3ab9608277\") " pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.713079 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl4z6\" (UniqueName: \"kubernetes.io/projected/d115a791-c703-4c6e-91e5-8f3ab9608277-kube-api-access-pl4z6\") pod \"redhat-operators-lmkgp\" (UID: \"d115a791-c703-4c6e-91e5-8f3ab9608277\") " pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.713196 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d115a791-c703-4c6e-91e5-8f3ab9608277-catalog-content\") pod \"redhat-operators-lmkgp\" (UID: \"d115a791-c703-4c6e-91e5-8f3ab9608277\") " pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.713274 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d115a791-c703-4c6e-91e5-8f3ab9608277-utilities\") pod \"redhat-operators-lmkgp\" (UID: \"d115a791-c703-4c6e-91e5-8f3ab9608277\") " pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.713947 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d115a791-c703-4c6e-91e5-8f3ab9608277-utilities\") pod \"redhat-operators-lmkgp\" (UID: \"d115a791-c703-4c6e-91e5-8f3ab9608277\") " pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.714618 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d115a791-c703-4c6e-91e5-8f3ab9608277-catalog-content\") pod \"redhat-operators-lmkgp\" (UID: \"d115a791-c703-4c6e-91e5-8f3ab9608277\") " pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.756412 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl4z6\" (UniqueName: \"kubernetes.io/projected/d115a791-c703-4c6e-91e5-8f3ab9608277-kube-api-access-pl4z6\") pod \"redhat-operators-lmkgp\" (UID: \"d115a791-c703-4c6e-91e5-8f3ab9608277\") " pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:00:37 crc kubenswrapper[4873]: I0219 10:00:37.902160 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:41.946896 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sjwbx"]
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:41.951528 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:41.970919 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjwbx"]
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:42.090796 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v2hw\" (UniqueName: \"kubernetes.io/projected/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-kube-api-access-4v2hw\") pod \"certified-operators-sjwbx\" (UID: \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\") " pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:42.090907 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-utilities\") pod \"certified-operators-sjwbx\" (UID: \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\") " pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:42.090949 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-catalog-content\") pod \"certified-operators-sjwbx\" (UID: \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\") " pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:42.192710 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v2hw\" (UniqueName: \"kubernetes.io/projected/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-kube-api-access-4v2hw\") pod \"certified-operators-sjwbx\" (UID: \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\") " pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:42.192825 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-utilities\") pod \"certified-operators-sjwbx\" (UID: \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\") " pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:42.192870 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-catalog-content\") pod \"certified-operators-sjwbx\" (UID: \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\") " pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:42.193493 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-catalog-content\") pod \"certified-operators-sjwbx\" (UID: \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\") " pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:42.193531 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-utilities\") pod \"certified-operators-sjwbx\" (UID: \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\") " pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:42.218514 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v2hw\" (UniqueName: \"kubernetes.io/projected/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-kube-api-access-4v2hw\") pod \"certified-operators-sjwbx\" (UID: \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\") " pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:42.279536 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:45.599900 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t5bgp" event={"ID":"de2f2331-fc83-420b-9e1b-fe08998cb0ab","Type":"ContainerStarted","Data":"d33b00dc248cbb8c3025337ed0ac581a594ca6f6603cb02e194746a85f320725"}
Feb 19 10:00:45 crc kubenswrapper[4873]: E0219 10:00:45.810077 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0"
Feb 19 10:00:45 crc kubenswrapper[4873]: E0219 10:00:45.810135 4873 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0"
Feb 19 10:00:45 crc kubenswrapper[4873]: E0219 10:00:45.810254 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7fb6h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(5224ec80-b354-467f-b660-2d22b9725be0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 10:00:45 crc kubenswrapper[4873]: E0219 10:00:45.811604 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="5224ec80-b354-467f-b660-2d22b9725be0"
Feb 19 10:00:45 crc kubenswrapper[4873]: I0219 10:00:45.869568 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7569d6d65f-54dks"
Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.056771 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-dns-svc\") pod \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\" (UID: \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\") "
Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.057040 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-config\") pod \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\" (UID: \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\") "
Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.057446 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pcf5\" (UniqueName: \"kubernetes.io/projected/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-kube-api-access-6pcf5\") pod \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\" (UID: \"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7\") "
Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.066084 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-kube-api-access-6pcf5" (OuterVolumeSpecName: "kube-api-access-6pcf5") pod "040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7" (UID: "040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7"). InnerVolumeSpecName "kube-api-access-6pcf5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.159711 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pcf5\" (UniqueName: \"kubernetes.io/projected/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-kube-api-access-6pcf5\") on node \"crc\" DevicePath \"\""
Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.196266 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-config" (OuterVolumeSpecName: "config") pod "040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7" (UID: "040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.210787 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7" (UID: "040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7"). InnerVolumeSpecName "dns-svc".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.261280 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.261308 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.450719 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmkgp"] Feb 19 10:00:46 crc kubenswrapper[4873]: W0219 10:00:46.489317 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd115a791_c703_4c6e_91e5_8f3ab9608277.slice/crio-164021a9f61ae1ea4080d3f61899481761a6227bd5066ab4023318b78119f680 WatchSource:0}: Error finding container 164021a9f61ae1ea4080d3f61899481761a6227bd5066ab4023318b78119f680: Status 404 returned error can't find the container with id 164021a9f61ae1ea4080d3f61899481761a6227bd5066ab4023318b78119f680 Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.562203 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjwbx"] Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.610737 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7569d6d65f-54dks" event={"ID":"040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7","Type":"ContainerDied","Data":"f1e538b93b7f75469d9218fb31bc488292aeea03fa4ada6d0bc787cf733da55f"} Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.611886 4873 scope.go:117] "RemoveContainer" containerID="097bfd5b65d03aa6a20da3832add63d24bee8dabfbbd70df5da33cb07c8feac3" Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.612014 
4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmkgp" event={"ID":"d115a791-c703-4c6e-91e5-8f3ab9608277","Type":"ContainerStarted","Data":"164021a9f61ae1ea4080d3f61899481761a6227bd5066ab4023318b78119f680"} Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.610808 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7569d6d65f-54dks" Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.613164 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjwbx" event={"ID":"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e","Type":"ContainerStarted","Data":"6ceab5ac0f605f81c09fcddc9dd0cf16ee123c3875c132c11dd895633c1f969a"} Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.615422 4873 generic.go:334] "Generic (PLEG): container finished" podID="de2f2331-fc83-420b-9e1b-fe08998cb0ab" containerID="d33b00dc248cbb8c3025337ed0ac581a594ca6f6603cb02e194746a85f320725" exitCode=0 Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.616552 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t5bgp" event={"ID":"de2f2331-fc83-420b-9e1b-fe08998cb0ab","Type":"ContainerDied","Data":"d33b00dc248cbb8c3025337ed0ac581a594ca6f6603cb02e194746a85f320725"} Feb 19 10:00:46 crc kubenswrapper[4873]: E0219 10:00:46.619385 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="5224ec80-b354-467f-b660-2d22b9725be0" Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.645289 4873 scope.go:117] "RemoveContainer" containerID="505d554bc6a88454d3df3439ef0d84b488679b7ec3847d5f4302a334f4220e6d" Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.684260 4873 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-7569d6d65f-54dks"] Feb 19 10:00:46 crc kubenswrapper[4873]: I0219 10:00:46.692188 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7569d6d65f-54dks"] Feb 19 10:00:47 crc kubenswrapper[4873]: I0219 10:00:47.499250 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7" path="/var/lib/kubelet/pods/040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7/volumes" Feb 19 10:00:47 crc kubenswrapper[4873]: I0219 10:00:47.628710 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4574f6e3-d697-424c-a9f1-7b74afb82324","Type":"ContainerStarted","Data":"20836d279f06285002cfbb0c8639e85b75eb8d99b8e51e3132bdeffd61ebf0bd"} Feb 19 10:00:47 crc kubenswrapper[4873]: I0219 10:00:47.632513 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"21bb5d7d-6565-484a-af2d-0edcff2729b3","Type":"ContainerStarted","Data":"14c21ce2634b1dc213dc0c0ab1a58e021e2dc9c23001d06f62c02e78efddce27"} Feb 19 10:00:47 crc kubenswrapper[4873]: I0219 10:00:47.633333 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 19 10:00:47 crc kubenswrapper[4873]: I0219 10:00:47.639111 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"877efa5f-4357-4396-8805-729237cd4e8f","Type":"ContainerStarted","Data":"71594272d5055445ca8570411417082fd82727b5b63787509b7b2e366874f623"} Feb 19 10:00:47 crc kubenswrapper[4873]: I0219 10:00:47.643921 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e3385c22-baa0-4261-b498-6a09c8768520","Type":"ContainerStarted","Data":"6e3e72cbe1386e2212554c075c732c997cd9eca5e8e80b367a0a82589ed0ceb4"} Feb 19 10:00:47 crc kubenswrapper[4873]: I0219 10:00:47.647238 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-galera-0" event={"ID":"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964","Type":"ContainerStarted","Data":"f86fddfaec0875e79580c640fa9abc158961398e84a30e8571f5d4f75941dd57"} Feb 19 10:00:47 crc kubenswrapper[4873]: I0219 10:00:47.649603 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=24.468808351 podStartE2EDuration="31.649591921s" podCreationTimestamp="2026-02-19 10:00:16 +0000 UTC" firstStartedPulling="2026-02-19 10:00:27.943205803 +0000 UTC m=+937.232637441" lastFinishedPulling="2026-02-19 10:00:35.123989333 +0000 UTC m=+944.413421011" observedRunningTime="2026-02-19 10:00:47.647681574 +0000 UTC m=+956.937113212" watchObservedRunningTime="2026-02-19 10:00:47.649591921 +0000 UTC m=+956.939023549" Feb 19 10:00:47 crc kubenswrapper[4873]: I0219 10:00:47.738425 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7569d6d65f-54dks" podUID="040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.99:5353: i/o timeout" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.127213 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9hfhg"] Feb 19 10:00:48 crc kubenswrapper[4873]: E0219 10:00:48.127756 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7" containerName="init" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.127774 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7" containerName="init" Feb 19 10:00:48 crc kubenswrapper[4873]: E0219 10:00:48.127790 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7" containerName="dnsmasq-dns" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.127796 4873 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7" containerName="dnsmasq-dns" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.127932 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="040a64d2-c5d3-44b7-9ffd-8bd12c3d68b7" containerName="dnsmasq-dns" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.131126 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9hfhg" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.148827 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9hfhg"] Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.240416 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.240470 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.240504 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.240860 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e9052ea8663914dbd7738866b6f51c9865aab9ba0562919ffd7db3fb01e7ded"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.240906 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://4e9052ea8663914dbd7738866b6f51c9865aab9ba0562919ffd7db3fb01e7ded" gracePeriod=600 Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.302475 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20bcc70-bf30-4949-951a-b36d083d205f-utilities\") pod \"community-operators-9hfhg\" (UID: \"f20bcc70-bf30-4949-951a-b36d083d205f\") " pod="openshift-marketplace/community-operators-9hfhg" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.302563 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20bcc70-bf30-4949-951a-b36d083d205f-catalog-content\") pod \"community-operators-9hfhg\" (UID: \"f20bcc70-bf30-4949-951a-b36d083d205f\") " pod="openshift-marketplace/community-operators-9hfhg" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.302614 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-772f4\" (UniqueName: \"kubernetes.io/projected/f20bcc70-bf30-4949-951a-b36d083d205f-kube-api-access-772f4\") pod \"community-operators-9hfhg\" (UID: \"f20bcc70-bf30-4949-951a-b36d083d205f\") " pod="openshift-marketplace/community-operators-9hfhg" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.403743 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20bcc70-bf30-4949-951a-b36d083d205f-utilities\") pod \"community-operators-9hfhg\" (UID: 
\"f20bcc70-bf30-4949-951a-b36d083d205f\") " pod="openshift-marketplace/community-operators-9hfhg" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.403854 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20bcc70-bf30-4949-951a-b36d083d205f-catalog-content\") pod \"community-operators-9hfhg\" (UID: \"f20bcc70-bf30-4949-951a-b36d083d205f\") " pod="openshift-marketplace/community-operators-9hfhg" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.403893 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-772f4\" (UniqueName: \"kubernetes.io/projected/f20bcc70-bf30-4949-951a-b36d083d205f-kube-api-access-772f4\") pod \"community-operators-9hfhg\" (UID: \"f20bcc70-bf30-4949-951a-b36d083d205f\") " pod="openshift-marketplace/community-operators-9hfhg" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.404176 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20bcc70-bf30-4949-951a-b36d083d205f-utilities\") pod \"community-operators-9hfhg\" (UID: \"f20bcc70-bf30-4949-951a-b36d083d205f\") " pod="openshift-marketplace/community-operators-9hfhg" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.404874 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20bcc70-bf30-4949-951a-b36d083d205f-catalog-content\") pod \"community-operators-9hfhg\" (UID: \"f20bcc70-bf30-4949-951a-b36d083d205f\") " pod="openshift-marketplace/community-operators-9hfhg" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.495079 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-772f4\" (UniqueName: \"kubernetes.io/projected/f20bcc70-bf30-4949-951a-b36d083d205f-kube-api-access-772f4\") pod \"community-operators-9hfhg\" (UID: 
\"f20bcc70-bf30-4949-951a-b36d083d205f\") " pod="openshift-marketplace/community-operators-9hfhg" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.656338 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"86685946-19ac-434a-974f-99b5beeda172","Type":"ContainerStarted","Data":"853752f484df74ddd70c3a27d9c1c59cdeac53c948f829a75d4f8ed34050d1fb"} Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.658768 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vsnt5" event={"ID":"b0ab9d21-0c11-4940-ad43-3e20c46012ad","Type":"ContainerStarted","Data":"c5457889bdf77cfda38fa2f89068415a1f93d64bf677bfcf5b9afd8ebe657440"} Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.658857 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-vsnt5" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.660216 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2","Type":"ContainerStarted","Data":"57567e0456fd2d45349518902ef81c44525ff9ae50ef5a0fa8ed4d2a66526532"} Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.665462 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9251ac9a-275e-4622-83a2-121d59ec8cd1","Type":"ContainerStarted","Data":"190198ff0e2017378a91068584666652381227259351cc680fcbb3817b8e453f"} Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.680615 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="4e9052ea8663914dbd7738866b6f51c9865aab9ba0562919ffd7db3fb01e7ded" exitCode=0 Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.680788 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" 
event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"4e9052ea8663914dbd7738866b6f51c9865aab9ba0562919ffd7db3fb01e7ded"} Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.680852 4873 scope.go:117] "RemoveContainer" containerID="025da7fd171f987961d862fe4ebef489eca80227003392ad78806aa501904663" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.717964 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vsnt5" podStartSLOduration=18.792274133 podStartE2EDuration="26.71794504s" podCreationTimestamp="2026-02-19 10:00:22 +0000 UTC" firstStartedPulling="2026-02-19 10:00:28.148159694 +0000 UTC m=+937.437591322" lastFinishedPulling="2026-02-19 10:00:36.073830571 +0000 UTC m=+945.363262229" observedRunningTime="2026-02-19 10:00:48.712906795 +0000 UTC m=+958.002338433" watchObservedRunningTime="2026-02-19 10:00:48.71794504 +0000 UTC m=+958.007376678" Feb 19 10:00:48 crc kubenswrapper[4873]: I0219 10:00:48.744189 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9hfhg" Feb 19 10:00:49 crc kubenswrapper[4873]: I0219 10:00:49.699420 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t5bgp" event={"ID":"de2f2331-fc83-420b-9e1b-fe08998cb0ab","Type":"ContainerStarted","Data":"6120178479b9fb316a67f93c74ba7e6f2722b43b8d9831ee092b8a0f17b21fb1"} Feb 19 10:00:49 crc kubenswrapper[4873]: I0219 10:00:49.715595 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0b46b116-4858-4b6a-b3ad-9337272f9a91","Type":"ContainerStarted","Data":"7b07d5d4936c52da51ddc101b9f0cc93c881b6b0ec5f359e20ad961f5451f0a9"} Feb 19 10:00:49 crc kubenswrapper[4873]: I0219 10:00:49.729387 4873 generic.go:334] "Generic (PLEG): container finished" podID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerID="c3c32c24ff9ddc9c878bf60c4e06dc7e24a6feab7886836d8ecf2510f7a2f602" exitCode=0 Feb 19 10:00:49 crc kubenswrapper[4873]: I0219 10:00:49.729455 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmkgp" event={"ID":"d115a791-c703-4c6e-91e5-8f3ab9608277","Type":"ContainerDied","Data":"c3c32c24ff9ddc9c878bf60c4e06dc7e24a6feab7886836d8ecf2510f7a2f602"} Feb 19 10:00:49 crc kubenswrapper[4873]: I0219 10:00:49.732688 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xw7xl" event={"ID":"8568b0bc-e3d1-4e4e-8172-bada186b750a","Type":"ContainerStarted","Data":"c66f926167963658837072f9619be2e502f49f78fdee38e1038232ea2b12e0fe"} Feb 19 10:00:49 crc kubenswrapper[4873]: I0219 10:00:49.744807 4873 generic.go:334] "Generic (PLEG): container finished" podID="dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" containerID="94b88ff2b105134857189f15d457bd06c0d3247317f372402ca302f0541cc41d" exitCode=0 Feb 19 10:00:49 crc kubenswrapper[4873]: I0219 10:00:49.745818 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-sjwbx" event={"ID":"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e","Type":"ContainerDied","Data":"94b88ff2b105134857189f15d457bd06c0d3247317f372402ca302f0541cc41d"} Feb 19 10:00:49 crc kubenswrapper[4873]: I0219 10:00:49.799719 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xw7xl" podStartSLOduration=9.387809768 podStartE2EDuration="25.799703109s" podCreationTimestamp="2026-02-19 10:00:24 +0000 UTC" firstStartedPulling="2026-02-19 10:00:29.871152504 +0000 UTC m=+939.160584142" lastFinishedPulling="2026-02-19 10:00:46.283045845 +0000 UTC m=+955.572477483" observedRunningTime="2026-02-19 10:00:49.789286812 +0000 UTC m=+959.078718450" watchObservedRunningTime="2026-02-19 10:00:49.799703109 +0000 UTC m=+959.089134747" Feb 19 10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.061317 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9hfhg"] Feb 19 10:00:50 crc kubenswrapper[4873]: W0219 10:00:50.069031 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf20bcc70_bf30_4949_951a_b36d083d205f.slice/crio-d368a20b1ccdc9da62de20626c013e559624ef41b56690236309ce9d1a2a14ac WatchSource:0}: Error finding container d368a20b1ccdc9da62de20626c013e559624ef41b56690236309ce9d1a2a14ac: Status 404 returned error can't find the container with id d368a20b1ccdc9da62de20626c013e559624ef41b56690236309ce9d1a2a14ac Feb 19 10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.752685 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"877efa5f-4357-4396-8805-729237cd4e8f","Type":"ContainerStarted","Data":"68f113e2f0c70661921c5b2f008496a1db957fc6375a87458aa5aeb82be012c9"} Feb 19 10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.754533 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-t5bgp" event={"ID":"de2f2331-fc83-420b-9e1b-fe08998cb0ab","Type":"ContainerStarted","Data":"5924680f305955cfe970df85dca208e136dd35437f00770b44c5859330aba705"} Feb 19 10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.754683 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.754706 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.755500 4873 generic.go:334] "Generic (PLEG): container finished" podID="f20bcc70-bf30-4949-951a-b36d083d205f" containerID="eba7f3213f6b8b4ed8b7cfa357b6cb425beb34251cc11a43d090e87e4e2033e9" exitCode=0 Feb 19 10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.755543 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hfhg" event={"ID":"f20bcc70-bf30-4949-951a-b36d083d205f","Type":"ContainerDied","Data":"eba7f3213f6b8b4ed8b7cfa357b6cb425beb34251cc11a43d090e87e4e2033e9"} Feb 19 10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.755558 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hfhg" event={"ID":"f20bcc70-bf30-4949-951a-b36d083d205f","Type":"ContainerStarted","Data":"d368a20b1ccdc9da62de20626c013e559624ef41b56690236309ce9d1a2a14ac"} Feb 19 10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.757691 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"4cf449f514dc24e840144e6f6decb8f1a064252cdbd9c34d791686fe659362f0"} Feb 19 10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.760138 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"4574f6e3-d697-424c-a9f1-7b74afb82324","Type":"ContainerStarted","Data":"93d09717c75b6be2c36343780e551d316d65d55535537a0805a9c6228cfa4fdc"} Feb 19 10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.775915 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.764270632 podStartE2EDuration="25.775894016s" podCreationTimestamp="2026-02-19 10:00:25 +0000 UTC" firstStartedPulling="2026-02-19 10:00:28.557746708 +0000 UTC m=+937.847178346" lastFinishedPulling="2026-02-19 10:00:49.569369902 +0000 UTC m=+958.858801730" observedRunningTime="2026-02-19 10:00:50.774496991 +0000 UTC m=+960.063928629" watchObservedRunningTime="2026-02-19 10:00:50.775894016 +0000 UTC m=+960.065325654" Feb 19 10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.795837 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-t5bgp" podStartSLOduration=22.601987528 podStartE2EDuration="28.795819447s" podCreationTimestamp="2026-02-19 10:00:22 +0000 UTC" firstStartedPulling="2026-02-19 10:00:29.88152922 +0000 UTC m=+939.170960858" lastFinishedPulling="2026-02-19 10:00:36.075361129 +0000 UTC m=+945.364792777" observedRunningTime="2026-02-19 10:00:50.793961391 +0000 UTC m=+960.083393029" watchObservedRunningTime="2026-02-19 10:00:50.795819447 +0000 UTC m=+960.085251085" Feb 19 10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.831739 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.115661104 podStartE2EDuration="25.831725182s" podCreationTimestamp="2026-02-19 10:00:25 +0000 UTC" firstStartedPulling="2026-02-19 10:00:29.88315806 +0000 UTC m=+939.172589688" lastFinishedPulling="2026-02-19 10:00:49.599222128 +0000 UTC m=+958.888653766" observedRunningTime="2026-02-19 10:00:50.828140843 +0000 UTC m=+960.117572481" watchObservedRunningTime="2026-02-19 10:00:50.831725182 +0000 UTC m=+960.121156820" Feb 19 
10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.879114 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:50 crc kubenswrapper[4873]: I0219 10:00:50.925670 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:51 crc kubenswrapper[4873]: I0219 10:00:51.648023 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:51 crc kubenswrapper[4873]: I0219 10:00:51.769340 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmkgp" event={"ID":"d115a791-c703-4c6e-91e5-8f3ab9608277","Type":"ContainerStarted","Data":"ddf2f2a59be91b05f4a24a87c32978d8341502e53c5702e5bb9180a41d34ff6f"} Feb 19 10:00:51 crc kubenswrapper[4873]: I0219 10:00:51.772743 4873 generic.go:334] "Generic (PLEG): container finished" podID="dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" containerID="26c7c425cb063e2664738a405a0bc48114123eb086c08ef61f83247d0e893cd3" exitCode=0 Feb 19 10:00:51 crc kubenswrapper[4873]: I0219 10:00:51.772781 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjwbx" event={"ID":"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e","Type":"ContainerDied","Data":"26c7c425cb063e2664738a405a0bc48114123eb086c08ef61f83247d0e893cd3"} Feb 19 10:00:51 crc kubenswrapper[4873]: I0219 10:00:51.774230 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:52 crc kubenswrapper[4873]: I0219 10:00:52.214732 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 19 10:00:52 crc kubenswrapper[4873]: I0219 10:00:52.782712 4873 generic.go:334] "Generic (PLEG): container finished" podID="f20bcc70-bf30-4949-951a-b36d083d205f" containerID="35d25dc52ac46707a9ed600e3f35ef54061a263d12693fd1d59a80b0fdce1fe0" exitCode=0 Feb 19 10:00:52 
crc kubenswrapper[4873]: I0219 10:00:52.782776 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hfhg" event={"ID":"f20bcc70-bf30-4949-951a-b36d083d205f","Type":"ContainerDied","Data":"35d25dc52ac46707a9ed600e3f35ef54061a263d12693fd1d59a80b0fdce1fe0"} Feb 19 10:00:52 crc kubenswrapper[4873]: I0219 10:00:52.784619 4873 generic.go:334] "Generic (PLEG): container finished" podID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerID="ddf2f2a59be91b05f4a24a87c32978d8341502e53c5702e5bb9180a41d34ff6f" exitCode=0 Feb 19 10:00:52 crc kubenswrapper[4873]: I0219 10:00:52.784643 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmkgp" event={"ID":"d115a791-c703-4c6e-91e5-8f3ab9608277","Type":"ContainerDied","Data":"ddf2f2a59be91b05f4a24a87c32978d8341502e53c5702e5bb9180a41d34ff6f"} Feb 19 10:00:52 crc kubenswrapper[4873]: I0219 10:00:52.841000 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.162540 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f8656d65-jtnp6"] Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.163990 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.165969 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.178922 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f8656d65-jtnp6"] Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.303240 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-dns-svc\") pod \"dnsmasq-dns-5f8656d65-jtnp6\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.303291 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-ovsdbserver-sb\") pod \"dnsmasq-dns-5f8656d65-jtnp6\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.303389 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-config\") pod \"dnsmasq-dns-5f8656d65-jtnp6\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.303441 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkdz7\" (UniqueName: \"kubernetes.io/projected/7d0de876-c87f-4760-b06f-87b8ff7e5588-kube-api-access-mkdz7\") pod \"dnsmasq-dns-5f8656d65-jtnp6\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " 
pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.370174 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-djxfb"] Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.371150 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.374233 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.404577 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-config\") pod \"dnsmasq-dns-5f8656d65-jtnp6\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.404634 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkdz7\" (UniqueName: \"kubernetes.io/projected/7d0de876-c87f-4760-b06f-87b8ff7e5588-kube-api-access-mkdz7\") pod \"dnsmasq-dns-5f8656d65-jtnp6\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.404724 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-dns-svc\") pod \"dnsmasq-dns-5f8656d65-jtnp6\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.404762 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5f8656d65-jtnp6\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.405766 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-ovsdbserver-sb\") pod \"dnsmasq-dns-5f8656d65-jtnp6\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.405815 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-config\") pod \"dnsmasq-dns-5f8656d65-jtnp6\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.406474 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-dns-svc\") pod \"dnsmasq-dns-5f8656d65-jtnp6\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.409641 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-djxfb"] Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.431865 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkdz7\" (UniqueName: \"kubernetes.io/projected/7d0de876-c87f-4760-b06f-87b8ff7e5588-kube-api-access-mkdz7\") pod \"dnsmasq-dns-5f8656d65-jtnp6\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.479256 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.508540 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/888c3336-cd8a-4bf2-805f-6b473fb272f4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.508587 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/888c3336-cd8a-4bf2-805f-6b473fb272f4-combined-ca-bundle\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.508629 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/888c3336-cd8a-4bf2-805f-6b473fb272f4-ovn-rundir\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.508657 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7b9k\" (UniqueName: \"kubernetes.io/projected/888c3336-cd8a-4bf2-805f-6b473fb272f4-kube-api-access-z7b9k\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.508730 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/888c3336-cd8a-4bf2-805f-6b473fb272f4-config\") pod 
\"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.508760 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/888c3336-cd8a-4bf2-805f-6b473fb272f4-ovs-rundir\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.610574 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/888c3336-cd8a-4bf2-805f-6b473fb272f4-config\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.610646 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/888c3336-cd8a-4bf2-805f-6b473fb272f4-ovs-rundir\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.610735 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/888c3336-cd8a-4bf2-805f-6b473fb272f4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.610760 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/888c3336-cd8a-4bf2-805f-6b473fb272f4-combined-ca-bundle\") pod \"ovn-controller-metrics-djxfb\" (UID: 
\"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.610810 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/888c3336-cd8a-4bf2-805f-6b473fb272f4-ovn-rundir\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.610849 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7b9k\" (UniqueName: \"kubernetes.io/projected/888c3336-cd8a-4bf2-805f-6b473fb272f4-kube-api-access-z7b9k\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.613648 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/888c3336-cd8a-4bf2-805f-6b473fb272f4-ovs-rundir\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.615894 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/888c3336-cd8a-4bf2-805f-6b473fb272f4-config\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.615972 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/888c3336-cd8a-4bf2-805f-6b473fb272f4-ovn-rundir\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 
crc kubenswrapper[4873]: I0219 10:00:53.618905 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/888c3336-cd8a-4bf2-805f-6b473fb272f4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.619032 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/888c3336-cd8a-4bf2-805f-6b473fb272f4-combined-ca-bundle\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.648130 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.649538 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7b9k\" (UniqueName: \"kubernetes.io/projected/888c3336-cd8a-4bf2-805f-6b473fb272f4-kube-api-access-z7b9k\") pod \"ovn-controller-metrics-djxfb\" (UID: \"888c3336-cd8a-4bf2-805f-6b473fb272f4\") " pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.691442 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-djxfb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.770974 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.847678 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f8656d65-jtnp6"] Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.872891 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.889412 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-767bbb56f-v5bpp"] Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.890674 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.895382 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 19 10:00:53 crc kubenswrapper[4873]: I0219 10:00:53.907710 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-767bbb56f-v5bpp"] Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.036165 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-config\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.036206 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sqkb\" (UniqueName: \"kubernetes.io/projected/d4069c7b-b867-4c6b-b5dd-91529a59d01c-kube-api-access-6sqkb\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: 
\"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.036247 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-ovsdbserver-nb\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.036314 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-ovsdbserver-sb\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.036343 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-dns-svc\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.098077 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.099385 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.100971 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.101220 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-9vxxd" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.101253 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.101527 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.138053 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-ovsdbserver-sb\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.138126 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-dns-svc\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.138171 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-config\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.138195 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-6sqkb\" (UniqueName: \"kubernetes.io/projected/d4069c7b-b867-4c6b-b5dd-91529a59d01c-kube-api-access-6sqkb\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.138224 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-ovsdbserver-nb\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.139079 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-ovsdbserver-nb\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.139081 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-dns-svc\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.139213 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.139616 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-config\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.139941 4873 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-ovsdbserver-sb\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.163832 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sqkb\" (UniqueName: \"kubernetes.io/projected/d4069c7b-b867-4c6b-b5dd-91529a59d01c-kube-api-access-6sqkb\") pod \"dnsmasq-dns-767bbb56f-v5bpp\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.218312 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.240205 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.240262 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88p26\" (UniqueName: \"kubernetes.io/projected/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-kube-api-access-88p26\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.240288 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-scripts\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " 
pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.240319 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.240428 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.240457 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.240483 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-config\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.341705 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-scripts\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.341749 4873 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.341843 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.341870 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.341890 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-config\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.341934 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.341957 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88p26\" (UniqueName: \"kubernetes.io/projected/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-kube-api-access-88p26\") pod \"ovn-northd-0\" (UID: 
\"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.342889 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-scripts\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.343145 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.344511 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-config\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.349693 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.350467 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.352378 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.392145 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88p26\" (UniqueName: \"kubernetes.io/projected/bd6df8e5-8bc5-4bd5-b466-a90642932cc2-kube-api-access-88p26\") pod \"ovn-northd-0\" (UID: \"bd6df8e5-8bc5-4bd5-b466-a90642932cc2\") " pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.412179 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.689214 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.689477 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.736791 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.815079 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-767bbb56f-v5bpp"] Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.825749 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-djxfb"] Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.865499 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xw7xl" Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.961328 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f8656d65-jtnp6"] Feb 19 10:00:54 crc 
kubenswrapper[4873]: W0219 10:00:54.965282 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d0de876_c87f_4760_b06f_87b8ff7e5588.slice/crio-27513f9246dbe172eded455fb97103ca558e42e17c318279c0c8d3c8528981cd WatchSource:0}: Error finding container 27513f9246dbe172eded455fb97103ca558e42e17c318279c0c8d3c8528981cd: Status 404 returned error can't find the container with id 27513f9246dbe172eded455fb97103ca558e42e17c318279c0c8d3c8528981cd
Feb 19 10:00:54 crc kubenswrapper[4873]: I0219 10:00:54.972700 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 19 10:00:54 crc kubenswrapper[4873]: W0219 10:00:54.972801 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd6df8e5_8bc5_4bd5_b466_a90642932cc2.slice/crio-bd1e7c563ea6665a05a01dfb3cd5c188624daab075cd1165d88e9b11c768e91a WatchSource:0}: Error finding container bd1e7c563ea6665a05a01dfb3cd5c188624daab075cd1165d88e9b11c768e91a: Status 404 returned error can't find the container with id bd1e7c563ea6665a05a01dfb3cd5c188624daab075cd1165d88e9b11c768e91a
Feb 19 10:00:55 crc kubenswrapper[4873]: I0219 10:00:55.811807 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-djxfb" event={"ID":"888c3336-cd8a-4bf2-805f-6b473fb272f4","Type":"ContainerStarted","Data":"6de985f9d9630982be09020970ef7e8fde590171a64d626fb0b58c8f59b3bbc4"}
Feb 19 10:00:55 crc kubenswrapper[4873]: I0219 10:00:55.814348 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjwbx" event={"ID":"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e","Type":"ContainerStarted","Data":"e25bfdf167e531e407faddba4519081a8819c0f481dad1ec82bb12868d58279f"}
Feb 19 10:00:55 crc kubenswrapper[4873]: I0219 10:00:55.815421 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" event={"ID":"d4069c7b-b867-4c6b-b5dd-91529a59d01c","Type":"ContainerStarted","Data":"be25608adf6f38eb11a1d8fdb4fb6018bdff1784849dbf4c40f2c123fef01c50"}
Feb 19 10:00:55 crc kubenswrapper[4873]: I0219 10:00:55.816783 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" event={"ID":"7d0de876-c87f-4760-b06f-87b8ff7e5588","Type":"ContainerStarted","Data":"27513f9246dbe172eded455fb97103ca558e42e17c318279c0c8d3c8528981cd"}
Feb 19 10:00:55 crc kubenswrapper[4873]: I0219 10:00:55.818262 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bd6df8e5-8bc5-4bd5-b466-a90642932cc2","Type":"ContainerStarted","Data":"bd1e7c563ea6665a05a01dfb3cd5c188624daab075cd1165d88e9b11c768e91a"}
Feb 19 10:00:58 crc kubenswrapper[4873]: I0219 10:00:58.324984 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xw7xl"]
Feb 19 10:00:58 crc kubenswrapper[4873]: I0219 10:00:58.325584 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xw7xl" podUID="8568b0bc-e3d1-4e4e-8172-bada186b750a" containerName="registry-server" containerID="cri-o://c66f926167963658837072f9619be2e502f49f78fdee38e1038232ea2b12e0fe" gracePeriod=2
Feb 19 10:00:58 crc kubenswrapper[4873]: I0219 10:00:58.862167 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sjwbx" podStartSLOduration=13.408298391 podStartE2EDuration="17.862141391s" podCreationTimestamp="2026-02-19 10:00:41 +0000 UTC" firstStartedPulling="2026-02-19 10:00:49.74904034 +0000 UTC m=+959.038471978" lastFinishedPulling="2026-02-19 10:00:54.20288334 +0000 UTC m=+963.492314978" observedRunningTime="2026-02-19 10:00:58.85924955 +0000 UTC m=+968.148681218" watchObservedRunningTime="2026-02-19 10:00:58.862141391 +0000 UTC m=+968.151573089"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.655934 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-767bbb56f-v5bpp"]
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.720017 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f5d85f6c-9z7rp"]
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.722833 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.738696 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f5d85f6c-9z7rp"]
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.746722 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-ovsdbserver-nb\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.746775 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrcpc\" (UniqueName: \"kubernetes.io/projected/710b77db-c69e-4428-93f6-7ce8b2c7ee17-kube-api-access-qrcpc\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.746800 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-dns-svc\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.746826 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-config\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.746899 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-ovsdbserver-sb\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.854051 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-ovsdbserver-sb\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.854907 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-ovsdbserver-sb\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.855040 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-ovsdbserver-nb\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.855070 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrcpc\" (UniqueName: \"kubernetes.io/projected/710b77db-c69e-4428-93f6-7ce8b2c7ee17-kube-api-access-qrcpc\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.855089 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-dns-svc\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.855189 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-config\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.855867 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-ovsdbserver-nb\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.856165 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-dns-svc\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.856294 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-config\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:00:59 crc kubenswrapper[4873]: I0219 10:00:59.898174 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrcpc\" (UniqueName: \"kubernetes.io/projected/710b77db-c69e-4428-93f6-7ce8b2c7ee17-kube-api-access-qrcpc\") pod \"dnsmasq-dns-7f5d85f6c-9z7rp\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") " pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.056768 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.758133 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.764266 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.766557 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.766693 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lrs6j"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.766884 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.767907 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.809153 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.868681 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b21a02-7162-42ca-84cf-e0fa36b04a22-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.868740 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gdhk\" (UniqueName: \"kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-kube-api-access-5gdhk\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.868765 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.868903 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.869000 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c3b21a02-7162-42ca-84cf-e0fa36b04a22-cache\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.869031 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c3b21a02-7162-42ca-84cf-e0fa36b04a22-lock\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.869246 4873 generic.go:334] "Generic (PLEG): container finished" podID="8568b0bc-e3d1-4e4e-8172-bada186b750a" containerID="c66f926167963658837072f9619be2e502f49f78fdee38e1038232ea2b12e0fe" exitCode=0
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.869290 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xw7xl" event={"ID":"8568b0bc-e3d1-4e4e-8172-bada186b750a","Type":"ContainerDied","Data":"c66f926167963658837072f9619be2e502f49f78fdee38e1038232ea2b12e0fe"}
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.969978 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.970047 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c3b21a02-7162-42ca-84cf-e0fa36b04a22-cache\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.970067 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c3b21a02-7162-42ca-84cf-e0fa36b04a22-lock\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.970094 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b21a02-7162-42ca-84cf-e0fa36b04a22-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.970135 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gdhk\" (UniqueName: \"kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-kube-api-access-5gdhk\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.970149 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: E0219 10:01:00.970324 4873 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 10:01:00 crc kubenswrapper[4873]: E0219 10:01:00.970338 4873 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 10:01:00 crc kubenswrapper[4873]: E0219 10:01:00.970379 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift podName:c3b21a02-7162-42ca-84cf-e0fa36b04a22 nodeName:}" failed. No retries permitted until 2026-02-19 10:01:01.470364286 +0000 UTC m=+970.759795924 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift") pod "swift-storage-0" (UID: "c3b21a02-7162-42ca-84cf-e0fa36b04a22") : configmap "swift-ring-files" not found
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.970797 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.971540 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c3b21a02-7162-42ca-84cf-e0fa36b04a22-cache\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.971970 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c3b21a02-7162-42ca-84cf-e0fa36b04a22-lock\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:00 crc kubenswrapper[4873]: I0219 10:01:00.978527 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b21a02-7162-42ca-84cf-e0fa36b04a22-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.054936 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gdhk\" (UniqueName: \"kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-kube-api-access-5gdhk\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.059319 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.323791 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mx6qq"]
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.324829 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.326274 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.326855 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.327027 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.340543 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mx6qq"]
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.377202 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-swiftconf\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.377242 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/91fbca18-847d-4e7b-8a40-e52dd348d155-ring-data-devices\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.377304 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5klgc\" (UniqueName: \"kubernetes.io/projected/91fbca18-847d-4e7b-8a40-e52dd348d155-kube-api-access-5klgc\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.377332 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-combined-ca-bundle\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.377399 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91fbca18-847d-4e7b-8a40-e52dd348d155-scripts\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.377468 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-dispersionconf\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.377526 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/91fbca18-847d-4e7b-8a40-e52dd348d155-etc-swift\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.478403 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91fbca18-847d-4e7b-8a40-e52dd348d155-scripts\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.478679 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:01 crc kubenswrapper[4873]: E0219 10:01:01.478841 4873 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 10:01:01 crc kubenswrapper[4873]: E0219 10:01:01.478858 4873 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 10:01:01 crc kubenswrapper[4873]: E0219 10:01:01.478896 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift podName:c3b21a02-7162-42ca-84cf-e0fa36b04a22 nodeName:}" failed. No retries permitted until 2026-02-19 10:01:02.478880458 +0000 UTC m=+971.768312096 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift") pod "swift-storage-0" (UID: "c3b21a02-7162-42ca-84cf-e0fa36b04a22") : configmap "swift-ring-files" not found
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.479124 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-dispersionconf\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.479296 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91fbca18-847d-4e7b-8a40-e52dd348d155-scripts\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.479484 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/91fbca18-847d-4e7b-8a40-e52dd348d155-etc-swift\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.479667 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-swiftconf\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.479808 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/91fbca18-847d-4e7b-8a40-e52dd348d155-ring-data-devices\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.479950 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5klgc\" (UniqueName: \"kubernetes.io/projected/91fbca18-847d-4e7b-8a40-e52dd348d155-kube-api-access-5klgc\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.480129 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-combined-ca-bundle\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.480801 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/91fbca18-847d-4e7b-8a40-e52dd348d155-etc-swift\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.480950 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/91fbca18-847d-4e7b-8a40-e52dd348d155-ring-data-devices\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.485436 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-dispersionconf\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.486139 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-combined-ca-bundle\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.498823 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-swiftconf\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.514754 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5klgc\" (UniqueName: \"kubernetes.io/projected/91fbca18-847d-4e7b-8a40-e52dd348d155-kube-api-access-5klgc\") pod \"swift-ring-rebalance-mx6qq\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") " pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:01 crc kubenswrapper[4873]: I0219 10:01:01.655204 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.079431 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xw7xl"
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.194151 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84czc\" (UniqueName: \"kubernetes.io/projected/8568b0bc-e3d1-4e4e-8172-bada186b750a-kube-api-access-84czc\") pod \"8568b0bc-e3d1-4e4e-8172-bada186b750a\" (UID: \"8568b0bc-e3d1-4e4e-8172-bada186b750a\") "
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.194334 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8568b0bc-e3d1-4e4e-8172-bada186b750a-catalog-content\") pod \"8568b0bc-e3d1-4e4e-8172-bada186b750a\" (UID: \"8568b0bc-e3d1-4e4e-8172-bada186b750a\") "
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.194456 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8568b0bc-e3d1-4e4e-8172-bada186b750a-utilities\") pod \"8568b0bc-e3d1-4e4e-8172-bada186b750a\" (UID: \"8568b0bc-e3d1-4e4e-8172-bada186b750a\") "
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.195504 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8568b0bc-e3d1-4e4e-8172-bada186b750a-utilities" (OuterVolumeSpecName: "utilities") pod "8568b0bc-e3d1-4e4e-8172-bada186b750a" (UID: "8568b0bc-e3d1-4e4e-8172-bada186b750a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.203479 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8568b0bc-e3d1-4e4e-8172-bada186b750a-kube-api-access-84czc" (OuterVolumeSpecName: "kube-api-access-84czc") pod "8568b0bc-e3d1-4e4e-8172-bada186b750a" (UID: "8568b0bc-e3d1-4e4e-8172-bada186b750a"). InnerVolumeSpecName "kube-api-access-84czc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.229611 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8568b0bc-e3d1-4e4e-8172-bada186b750a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8568b0bc-e3d1-4e4e-8172-bada186b750a" (UID: "8568b0bc-e3d1-4e4e-8172-bada186b750a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.233432 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f5d85f6c-9z7rp"]
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.280931 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.280982 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.296472 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8568b0bc-e3d1-4e4e-8172-bada186b750a-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.296505 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84czc\" (UniqueName: \"kubernetes.io/projected/8568b0bc-e3d1-4e4e-8172-bada186b750a-kube-api-access-84czc\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.296519 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8568b0bc-e3d1-4e4e-8172-bada186b750a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.333112 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.499649 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:02 crc kubenswrapper[4873]: E0219 10:01:02.499918 4873 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 10:01:02 crc kubenswrapper[4873]: E0219 10:01:02.499940 4873 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 10:01:02 crc kubenswrapper[4873]: E0219 10:01:02.499995 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift podName:c3b21a02-7162-42ca-84cf-e0fa36b04a22 nodeName:}" failed. No retries permitted until 2026-02-19 10:01:04.499979331 +0000 UTC m=+973.789410969 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift") pod "swift-storage-0" (UID: "c3b21a02-7162-42ca-84cf-e0fa36b04a22") : configmap "swift-ring-files" not found
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.907334 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp" event={"ID":"710b77db-c69e-4428-93f6-7ce8b2c7ee17","Type":"ContainerStarted","Data":"d5213d8f776a516eb0ebc1bff77eadf707410bc2d3c6d133cd538660a60a385d"}
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.909436 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xw7xl" event={"ID":"8568b0bc-e3d1-4e4e-8172-bada186b750a","Type":"ContainerDied","Data":"a80672d41aa2943f5cef88b806f5ebd8b5afa23daa0fded90e827fd2b0faf463"}
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.909479 4873 scope.go:117] "RemoveContainer" containerID="c66f926167963658837072f9619be2e502f49f78fdee38e1038232ea2b12e0fe"
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.909591 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xw7xl"
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.918233 4873 generic.go:334] "Generic (PLEG): container finished" podID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerID="7b07d5d4936c52da51ddc101b9f0cc93c881b6b0ec5f359e20ad961f5451f0a9" exitCode=0
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.918306 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0b46b116-4858-4b6a-b3ad-9337272f9a91","Type":"ContainerDied","Data":"7b07d5d4936c52da51ddc101b9f0cc93c881b6b0ec5f359e20ad961f5451f0a9"}
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.975346 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xw7xl"]
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.982009 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xw7xl"]
Feb 19 10:01:02 crc kubenswrapper[4873]: I0219 10:01:02.994131 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sjwbx"
Feb 19 10:01:03 crc kubenswrapper[4873]: I0219 10:01:03.493214 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8568b0bc-e3d1-4e4e-8172-bada186b750a" path="/var/lib/kubelet/pods/8568b0bc-e3d1-4e4e-8172-bada186b750a/volumes"
Feb 19 10:01:03 crc kubenswrapper[4873]: I0219 10:01:03.888922 4873 scope.go:117] "RemoveContainer" containerID="8997664180cc4a89bc96536538fcab96c359ac683add2f689bc03e6ec23ed7ec"
Feb 19 10:01:03 crc kubenswrapper[4873]: I0219 10:01:03.926558 4873 generic.go:334] "Generic (PLEG): container finished" podID="7d0de876-c87f-4760-b06f-87b8ff7e5588" containerID="556b5cd262209047f1b14207b0a637a7b665f1e1891d8c5b5396469110a2c80f" exitCode=0
Feb 19 10:01:03 crc kubenswrapper[4873]: I0219 10:01:03.926708 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" event={"ID":"7d0de876-c87f-4760-b06f-87b8ff7e5588","Type":"ContainerDied","Data":"556b5cd262209047f1b14207b0a637a7b665f1e1891d8c5b5396469110a2c80f"}
Feb 19 10:01:03 crc kubenswrapper[4873]: I0219 10:01:03.933906 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hfhg" event={"ID":"f20bcc70-bf30-4949-951a-b36d083d205f","Type":"ContainerStarted","Data":"795aba3e60e31b3a457db244680452c8576c3d27a6b31cef59da782c97326eb3"}
Feb 19 10:01:03 crc kubenswrapper[4873]: I0219 10:01:03.938631 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-djxfb" event={"ID":"888c3336-cd8a-4bf2-805f-6b473fb272f4","Type":"ContainerStarted","Data":"ce29cfdf182270bd176b751824b9cb006868ef939d551449b7f39e7d292a4e1d"}
Feb 19 10:01:03 crc kubenswrapper[4873]: I0219 10:01:03.940430 4873 generic.go:334] "Generic (PLEG): container finished" podID="d4069c7b-b867-4c6b-b5dd-91529a59d01c" containerID="29f5ae38e1bf6410aa018f009d7b5755d54f5baf0b293e53dd34ae617aad0329" exitCode=0
Feb 19 10:01:03 crc kubenswrapper[4873]: I0219 10:01:03.940504 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" event={"ID":"d4069c7b-b867-4c6b-b5dd-91529a59d01c","Type":"ContainerDied","Data":"29f5ae38e1bf6410aa018f009d7b5755d54f5baf0b293e53dd34ae617aad0329"}
Feb 19 10:01:03 crc kubenswrapper[4873]: I0219 10:01:03.977066 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9hfhg" podStartSLOduration=5.327064622 podStartE2EDuration="15.977050957s" podCreationTimestamp="2026-02-19 10:00:48 +0000 UTC" firstStartedPulling="2026-02-19 10:00:50.86856088 +0000 UTC m=+960.157992518" lastFinishedPulling="2026-02-19 10:01:01.518547195 +0000 UTC m=+970.807978853" observedRunningTime="2026-02-19 10:01:03.976118434 +0000 UTC m=+973.265550062" watchObservedRunningTime="2026-02-19 
10:01:03.977050957 +0000 UTC m=+973.266482595" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.079882 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-djxfb" podStartSLOduration=11.079851507 podStartE2EDuration="11.079851507s" podCreationTimestamp="2026-02-19 10:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:01:04.068513356 +0000 UTC m=+973.357944994" watchObservedRunningTime="2026-02-19 10:01:04.079851507 +0000 UTC m=+973.369283175" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.409335 4873 scope.go:117] "RemoveContainer" containerID="dcfd39b1c5289cd4a8556f216e399630fd4927789a8b353b5566c5719ea3fcee" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.463583 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.556006 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkdz7\" (UniqueName: \"kubernetes.io/projected/7d0de876-c87f-4760-b06f-87b8ff7e5588-kube-api-access-mkdz7\") pod \"7d0de876-c87f-4760-b06f-87b8ff7e5588\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.556291 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0" Feb 19 10:01:04 crc kubenswrapper[4873]: E0219 10:01:04.556572 4873 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 10:01:04 crc kubenswrapper[4873]: E0219 10:01:04.557773 4873 projected.go:194] Error preparing data for 
projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 10:01:04 crc kubenswrapper[4873]: E0219 10:01:04.557861 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift podName:c3b21a02-7162-42ca-84cf-e0fa36b04a22 nodeName:}" failed. No retries permitted until 2026-02-19 10:01:08.557836776 +0000 UTC m=+977.847268464 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift") pod "swift-storage-0" (UID: "c3b21a02-7162-42ca-84cf-e0fa36b04a22") : configmap "swift-ring-files" not found Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.563720 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d0de876-c87f-4760-b06f-87b8ff7e5588-kube-api-access-mkdz7" (OuterVolumeSpecName: "kube-api-access-mkdz7") pod "7d0de876-c87f-4760-b06f-87b8ff7e5588" (UID: "7d0de876-c87f-4760-b06f-87b8ff7e5588"). InnerVolumeSpecName "kube-api-access-mkdz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.657080 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-config\") pod \"7d0de876-c87f-4760-b06f-87b8ff7e5588\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.657219 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-dns-svc\") pod \"7d0de876-c87f-4760-b06f-87b8ff7e5588\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.657288 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-ovsdbserver-sb\") pod \"7d0de876-c87f-4760-b06f-87b8ff7e5588\" (UID: \"7d0de876-c87f-4760-b06f-87b8ff7e5588\") " Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.657703 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkdz7\" (UniqueName: \"kubernetes.io/projected/7d0de876-c87f-4760-b06f-87b8ff7e5588-kube-api-access-mkdz7\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.679814 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7d0de876-c87f-4760-b06f-87b8ff7e5588" (UID: "7d0de876-c87f-4760-b06f-87b8ff7e5588"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.679894 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-config" (OuterVolumeSpecName: "config") pod "7d0de876-c87f-4760-b06f-87b8ff7e5588" (UID: "7d0de876-c87f-4760-b06f-87b8ff7e5588"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.685648 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7d0de876-c87f-4760-b06f-87b8ff7e5588" (UID: "7d0de876-c87f-4760-b06f-87b8ff7e5588"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.760470 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.761397 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.761427 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.761440 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d0de876-c87f-4760-b06f-87b8ff7e5588-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.862476 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-dns-svc\") pod \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.862511 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-config\") pod \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.862680 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-ovsdbserver-nb\") pod \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.862745 4873 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-ovsdbserver-sb\") pod \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.862794 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sqkb\" (UniqueName: \"kubernetes.io/projected/d4069c7b-b867-4c6b-b5dd-91529a59d01c-kube-api-access-6sqkb\") pod \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\" (UID: \"d4069c7b-b867-4c6b-b5dd-91529a59d01c\") " Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.869702 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4069c7b-b867-4c6b-b5dd-91529a59d01c-kube-api-access-6sqkb" (OuterVolumeSpecName: "kube-api-access-6sqkb") pod "d4069c7b-b867-4c6b-b5dd-91529a59d01c" (UID: "d4069c7b-b867-4c6b-b5dd-91529a59d01c"). InnerVolumeSpecName "kube-api-access-6sqkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.889626 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d4069c7b-b867-4c6b-b5dd-91529a59d01c" (UID: "d4069c7b-b867-4c6b-b5dd-91529a59d01c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.899620 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d4069c7b-b867-4c6b-b5dd-91529a59d01c" (UID: "d4069c7b-b867-4c6b-b5dd-91529a59d01c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.899797 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d4069c7b-b867-4c6b-b5dd-91529a59d01c" (UID: "d4069c7b-b867-4c6b-b5dd-91529a59d01c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.899986 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-config" (OuterVolumeSpecName: "config") pod "d4069c7b-b867-4c6b-b5dd-91529a59d01c" (UID: "d4069c7b-b867-4c6b-b5dd-91529a59d01c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.965288 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sqkb\" (UniqueName: \"kubernetes.io/projected/d4069c7b-b867-4c6b-b5dd-91529a59d01c-kube-api-access-6sqkb\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.965936 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.965952 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.965965 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:04 crc 
kubenswrapper[4873]: I0219 10:01:04.965977 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4069c7b-b867-4c6b-b5dd-91529a59d01c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.970768 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" event={"ID":"d4069c7b-b867-4c6b-b5dd-91529a59d01c","Type":"ContainerDied","Data":"be25608adf6f38eb11a1d8fdb4fb6018bdff1784849dbf4c40f2c123fef01c50"} Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.970817 4873 scope.go:117] "RemoveContainer" containerID="29f5ae38e1bf6410aa018f009d7b5755d54f5baf0b293e53dd34ae617aad0329" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.970888 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-767bbb56f-v5bpp" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.986607 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" Feb 19 10:01:04 crc kubenswrapper[4873]: I0219 10:01:04.986991 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f8656d65-jtnp6" event={"ID":"7d0de876-c87f-4760-b06f-87b8ff7e5588","Type":"ContainerDied","Data":"27513f9246dbe172eded455fb97103ca558e42e17c318279c0c8d3c8528981cd"} Feb 19 10:01:05 crc kubenswrapper[4873]: I0219 10:01:05.048256 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-767bbb56f-v5bpp"] Feb 19 10:01:05 crc kubenswrapper[4873]: I0219 10:01:05.068584 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-767bbb56f-v5bpp"] Feb 19 10:01:05 crc kubenswrapper[4873]: I0219 10:01:05.079420 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f8656d65-jtnp6"] Feb 19 10:01:05 crc kubenswrapper[4873]: I0219 10:01:05.085564 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f8656d65-jtnp6"] Feb 19 10:01:05 crc kubenswrapper[4873]: I0219 10:01:05.239617 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mx6qq"] Feb 19 10:01:05 crc kubenswrapper[4873]: I0219 10:01:05.332299 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjwbx"] Feb 19 10:01:05 crc kubenswrapper[4873]: I0219 10:01:05.332523 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sjwbx" podUID="dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" containerName="registry-server" containerID="cri-o://e25bfdf167e531e407faddba4519081a8819c0f481dad1ec82bb12868d58279f" gracePeriod=2 Feb 19 10:01:05 crc kubenswrapper[4873]: I0219 10:01:05.497347 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d0de876-c87f-4760-b06f-87b8ff7e5588" path="/var/lib/kubelet/pods/7d0de876-c87f-4760-b06f-87b8ff7e5588/volumes" Feb 19 
10:01:05 crc kubenswrapper[4873]: I0219 10:01:05.497837 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4069c7b-b867-4c6b-b5dd-91529a59d01c" path="/var/lib/kubelet/pods/d4069c7b-b867-4c6b-b5dd-91529a59d01c/volumes" Feb 19 10:01:05 crc kubenswrapper[4873]: I0219 10:01:05.801044 4873 scope.go:117] "RemoveContainer" containerID="556b5cd262209047f1b14207b0a637a7b665f1e1891d8c5b5396469110a2c80f" Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.023986 4873 generic.go:334] "Generic (PLEG): container finished" podID="dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" containerID="e25bfdf167e531e407faddba4519081a8819c0f481dad1ec82bb12868d58279f" exitCode=0 Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.024089 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjwbx" event={"ID":"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e","Type":"ContainerDied","Data":"e25bfdf167e531e407faddba4519081a8819c0f481dad1ec82bb12868d58279f"} Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.043037 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmkgp" event={"ID":"d115a791-c703-4c6e-91e5-8f3ab9608277","Type":"ContainerStarted","Data":"a571f36e40c7ad937de6c80c4fe2960d3a56ca51d705e8212a16e2333ee169bb"} Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.051523 4873 generic.go:334] "Generic (PLEG): container finished" podID="710b77db-c69e-4428-93f6-7ce8b2c7ee17" containerID="84d6805fa8e8d8d98a61caa72fabcd292a85d006dfab20b6192c2c1765a01aa4" exitCode=0 Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.051610 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp" event={"ID":"710b77db-c69e-4428-93f6-7ce8b2c7ee17","Type":"ContainerDied","Data":"84d6805fa8e8d8d98a61caa72fabcd292a85d006dfab20b6192c2c1765a01aa4"} Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.067963 4873 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-ring-rebalance-mx6qq" event={"ID":"91fbca18-847d-4e7b-8a40-e52dd348d155","Type":"ContainerStarted","Data":"a4cfa9b70a12969456d4389138494314fd5620b64ab7eefa7cbad4c7f8f20a88"} Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.074383 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lmkgp" podStartSLOduration=14.010381125 podStartE2EDuration="29.074361499s" podCreationTimestamp="2026-02-19 10:00:37 +0000 UTC" firstStartedPulling="2026-02-19 10:00:49.731416566 +0000 UTC m=+959.020848204" lastFinishedPulling="2026-02-19 10:01:04.79539695 +0000 UTC m=+974.084828578" observedRunningTime="2026-02-19 10:01:06.068180285 +0000 UTC m=+975.357611923" watchObservedRunningTime="2026-02-19 10:01:06.074361499 +0000 UTC m=+975.363793137" Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.252234 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjwbx" Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.324187 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-catalog-content\") pod \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\" (UID: \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\") " Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.324490 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-utilities\") pod \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\" (UID: \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\") " Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.324557 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v2hw\" (UniqueName: 
\"kubernetes.io/projected/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-kube-api-access-4v2hw\") pod \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\" (UID: \"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e\") " Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.325523 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-utilities" (OuterVolumeSpecName: "utilities") pod "dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" (UID: "dd6b83dc-5d8c-48f7-9e5e-9373c786f31e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.328692 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-kube-api-access-4v2hw" (OuterVolumeSpecName: "kube-api-access-4v2hw") pod "dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" (UID: "dd6b83dc-5d8c-48f7-9e5e-9373c786f31e"). InnerVolumeSpecName "kube-api-access-4v2hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.390485 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" (UID: "dd6b83dc-5d8c-48f7-9e5e-9373c786f31e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.427217 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.427259 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:06 crc kubenswrapper[4873]: I0219 10:01:06.427270 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v2hw\" (UniqueName: \"kubernetes.io/projected/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e-kube-api-access-4v2hw\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.084712 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp" event={"ID":"710b77db-c69e-4428-93f6-7ce8b2c7ee17","Type":"ContainerStarted","Data":"9b82833ea4c40ca9217967e3dd7ede76510477fd6b291d755ae79e49cbe06c97"} Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.085273 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp" Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.089016 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bd6df8e5-8bc5-4bd5-b466-a90642932cc2","Type":"ContainerStarted","Data":"0c596bb1b79046534852c1d9e3299a9e12cd3769e29292507fc91e6583bf59a7"} Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.089062 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bd6df8e5-8bc5-4bd5-b466-a90642932cc2","Type":"ContainerStarted","Data":"8e00278cfa2023995186da21c0a1d3e11321b27b525e575a4a20a1bd8589a082"} Feb 19 10:01:07 crc kubenswrapper[4873]: 
I0219 10:01:07.089571 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.091293 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjwbx" event={"ID":"dd6b83dc-5d8c-48f7-9e5e-9373c786f31e","Type":"ContainerDied","Data":"6ceab5ac0f605f81c09fcddc9dd0cf16ee123c3875c132c11dd895633c1f969a"} Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.091335 4873 scope.go:117] "RemoveContainer" containerID="e25bfdf167e531e407faddba4519081a8819c0f481dad1ec82bb12868d58279f" Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.091434 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjwbx" Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.096141 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5224ec80-b354-467f-b660-2d22b9725be0","Type":"ContainerStarted","Data":"fff297dee24ce88f13715d9cd5435080bd7f84d8bee1670209f4a402d95507f1"} Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.096570 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.111707 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp" podStartSLOduration=8.111690194 podStartE2EDuration="8.111690194s" podCreationTimestamp="2026-02-19 10:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:01:07.10627477 +0000 UTC m=+976.395706398" watchObservedRunningTime="2026-02-19 10:01:07.111690194 +0000 UTC m=+976.401121832" Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.149209 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-northd-0" podStartSLOduration=6.836120793 podStartE2EDuration="13.149175564s" podCreationTimestamp="2026-02-19 10:00:54 +0000 UTC" firstStartedPulling="2026-02-19 10:00:59.53285835 +0000 UTC m=+968.822289988" lastFinishedPulling="2026-02-19 10:01:05.845913121 +0000 UTC m=+975.135344759" observedRunningTime="2026-02-19 10:01:07.144339374 +0000 UTC m=+976.433771032" watchObservedRunningTime="2026-02-19 10:01:07.149175564 +0000 UTC m=+976.438607202" Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.160809 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.51370348 podStartE2EDuration="48.160793752s" podCreationTimestamp="2026-02-19 10:00:19 +0000 UTC" firstStartedPulling="2026-02-19 10:00:28.160412476 +0000 UTC m=+937.449844114" lastFinishedPulling="2026-02-19 10:01:05.807502738 +0000 UTC m=+975.096934386" observedRunningTime="2026-02-19 10:01:07.159151512 +0000 UTC m=+976.448583170" watchObservedRunningTime="2026-02-19 10:01:07.160793752 +0000 UTC m=+976.450225390" Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.178637 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjwbx"] Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.184959 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sjwbx"] Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.494160 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" path="/var/lib/kubelet/pods/dd6b83dc-5d8c-48f7-9e5e-9373c786f31e/volumes" Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.903172 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lmkgp" Feb 19 10:01:07 crc kubenswrapper[4873]: I0219 10:01:07.903242 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:01:08 crc kubenswrapper[4873]: I0219 10:01:08.567165 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:08 crc kubenswrapper[4873]: E0219 10:01:08.567353 4873 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 10:01:08 crc kubenswrapper[4873]: E0219 10:01:08.567534 4873 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 10:01:08 crc kubenswrapper[4873]: E0219 10:01:08.567589 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift podName:c3b21a02-7162-42ca-84cf-e0fa36b04a22 nodeName:}" failed. No retries permitted until 2026-02-19 10:01:16.567574254 +0000 UTC m=+985.857005882 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift") pod "swift-storage-0" (UID: "c3b21a02-7162-42ca-84cf-e0fa36b04a22") : configmap "swift-ring-files" not found
Feb 19 10:01:08 crc kubenswrapper[4873]: I0219 10:01:08.745231 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9hfhg"
Feb 19 10:01:08 crc kubenswrapper[4873]: I0219 10:01:08.745382 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9hfhg"
Feb 19 10:01:08 crc kubenswrapper[4873]: I0219 10:01:08.805502 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9hfhg"
Feb 19 10:01:08 crc kubenswrapper[4873]: I0219 10:01:08.945089 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lmkgp" podUID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerName="registry-server" probeResult="failure" output=<
Feb 19 10:01:08 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s
Feb 19 10:01:08 crc kubenswrapper[4873]: >
Feb 19 10:01:09 crc kubenswrapper[4873]: I0219 10:01:09.094031 4873 scope.go:117] "RemoveContainer" containerID="26c7c425cb063e2664738a405a0bc48114123eb086c08ef61f83247d0e893cd3"
Feb 19 10:01:09 crc kubenswrapper[4873]: I0219 10:01:09.130505 4873 scope.go:117] "RemoveContainer" containerID="94b88ff2b105134857189f15d457bd06c0d3247317f372402ca302f0541cc41d"
Feb 19 10:01:09 crc kubenswrapper[4873]: I0219 10:01:09.198323 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9hfhg"
Feb 19 10:01:10 crc kubenswrapper[4873]: I0219 10:01:10.160912 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mx6qq" event={"ID":"91fbca18-847d-4e7b-8a40-e52dd348d155","Type":"ContainerStarted","Data":"9dbc805ae65f008dd93910bebe60ab94566a15abae9fcb90aa3ea07d5a696df2"}
Feb 19 10:01:10 crc kubenswrapper[4873]: I0219 10:01:10.182214 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mx6qq" podStartSLOduration=5.842911275 podStartE2EDuration="9.1821998s" podCreationTimestamp="2026-02-19 10:01:01 +0000 UTC" firstStartedPulling="2026-02-19 10:01:05.84184682 +0000 UTC m=+975.131278458" lastFinishedPulling="2026-02-19 10:01:09.181135345 +0000 UTC m=+978.470566983" observedRunningTime="2026-02-19 10:01:10.179988505 +0000 UTC m=+979.469420143" watchObservedRunningTime="2026-02-19 10:01:10.1821998 +0000 UTC m=+979.471631438"
Feb 19 10:01:11 crc kubenswrapper[4873]: I0219 10:01:11.171083 4873 generic.go:334] "Generic (PLEG): container finished" podID="e3385c22-baa0-4261-b498-6a09c8768520" containerID="6e3e72cbe1386e2212554c075c732c997cd9eca5e8e80b367a0a82589ed0ceb4" exitCode=0
Feb 19 10:01:11 crc kubenswrapper[4873]: I0219 10:01:11.171152 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e3385c22-baa0-4261-b498-6a09c8768520","Type":"ContainerDied","Data":"6e3e72cbe1386e2212554c075c732c997cd9eca5e8e80b367a0a82589ed0ceb4"}
Feb 19 10:01:11 crc kubenswrapper[4873]: I0219 10:01:11.175206 4873 generic.go:334] "Generic (PLEG): container finished" podID="f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964" containerID="f86fddfaec0875e79580c640fa9abc158961398e84a30e8571f5d4f75941dd57" exitCode=0
Feb 19 10:01:11 crc kubenswrapper[4873]: I0219 10:01:11.175241 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964","Type":"ContainerDied","Data":"f86fddfaec0875e79580c640fa9abc158961398e84a30e8571f5d4f75941dd57"}
Feb 19 10:01:11 crc kubenswrapper[4873]: I0219 10:01:11.724790 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9hfhg"]
Feb 19 10:01:12 crc kubenswrapper[4873]: I0219 10:01:12.191905 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e3385c22-baa0-4261-b498-6a09c8768520","Type":"ContainerStarted","Data":"c0ff139ab7b9c15c5ce4902e4dba259d9eab3e681a00c3431e03eaf2cbeb34a2"}
Feb 19 10:01:12 crc kubenswrapper[4873]: I0219 10:01:12.195655 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964","Type":"ContainerStarted","Data":"1e830e7fa894ecd68ee7b302c102fb509a703a04c9a8887c6a5ab35b125981fe"}
Feb 19 10:01:12 crc kubenswrapper[4873]: I0219 10:01:12.196008 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9hfhg" podUID="f20bcc70-bf30-4949-951a-b36d083d205f" containerName="registry-server" containerID="cri-o://795aba3e60e31b3a457db244680452c8576c3d27a6b31cef59da782c97326eb3" gracePeriod=2
Feb 19 10:01:12 crc kubenswrapper[4873]: I0219 10:01:12.221736 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=49.308784756 podStartE2EDuration="57.221692059s" podCreationTimestamp="2026-02-19 10:00:15 +0000 UTC" firstStartedPulling="2026-02-19 10:00:28.160924038 +0000 UTC m=+937.450355676" lastFinishedPulling="2026-02-19 10:00:36.073831341 +0000 UTC m=+945.363262979" observedRunningTime="2026-02-19 10:01:12.215415353 +0000 UTC m=+981.504847001" watchObservedRunningTime="2026-02-19 10:01:12.221692059 +0000 UTC m=+981.511123697"
Feb 19 10:01:12 crc kubenswrapper[4873]: I0219 10:01:12.244149 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=49.478754334 podStartE2EDuration="58.244129735s" podCreationTimestamp="2026-02-19 10:00:14 +0000 UTC" firstStartedPulling="2026-02-19 10:00:27.636275009 +0000 UTC m=+936.925706647" lastFinishedPulling="2026-02-19 10:00:36.40165041 +0000 UTC m=+945.691082048" observedRunningTime="2026-02-19 10:01:12.235312207 +0000 UTC m=+981.524743865" watchObservedRunningTime="2026-02-19 10:01:12.244129735 +0000 UTC m=+981.533561373"
Feb 19 10:01:13 crc kubenswrapper[4873]: I0219 10:01:13.205970 4873 generic.go:334] "Generic (PLEG): container finished" podID="f20bcc70-bf30-4949-951a-b36d083d205f" containerID="795aba3e60e31b3a457db244680452c8576c3d27a6b31cef59da782c97326eb3" exitCode=0
Feb 19 10:01:13 crc kubenswrapper[4873]: I0219 10:01:13.206635 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hfhg" event={"ID":"f20bcc70-bf30-4949-951a-b36d083d205f","Type":"ContainerDied","Data":"795aba3e60e31b3a457db244680452c8576c3d27a6b31cef59da782c97326eb3"}
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.008868 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9hfhg"
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.183845 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20bcc70-bf30-4949-951a-b36d083d205f-utilities\") pod \"f20bcc70-bf30-4949-951a-b36d083d205f\" (UID: \"f20bcc70-bf30-4949-951a-b36d083d205f\") "
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.183990 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-772f4\" (UniqueName: \"kubernetes.io/projected/f20bcc70-bf30-4949-951a-b36d083d205f-kube-api-access-772f4\") pod \"f20bcc70-bf30-4949-951a-b36d083d205f\" (UID: \"f20bcc70-bf30-4949-951a-b36d083d205f\") "
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.184046 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20bcc70-bf30-4949-951a-b36d083d205f-catalog-content\") pod \"f20bcc70-bf30-4949-951a-b36d083d205f\" (UID: \"f20bcc70-bf30-4949-951a-b36d083d205f\") "
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.185331 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f20bcc70-bf30-4949-951a-b36d083d205f-utilities" (OuterVolumeSpecName: "utilities") pod "f20bcc70-bf30-4949-951a-b36d083d205f" (UID: "f20bcc70-bf30-4949-951a-b36d083d205f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.192283 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f20bcc70-bf30-4949-951a-b36d083d205f-kube-api-access-772f4" (OuterVolumeSpecName: "kube-api-access-772f4") pod "f20bcc70-bf30-4949-951a-b36d083d205f" (UID: "f20bcc70-bf30-4949-951a-b36d083d205f"). InnerVolumeSpecName "kube-api-access-772f4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.219951 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hfhg" event={"ID":"f20bcc70-bf30-4949-951a-b36d083d205f","Type":"ContainerDied","Data":"d368a20b1ccdc9da62de20626c013e559624ef41b56690236309ce9d1a2a14ac"}
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.219962 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9hfhg"
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.221197 4873 scope.go:117] "RemoveContainer" containerID="795aba3e60e31b3a457db244680452c8576c3d27a6b31cef59da782c97326eb3"
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.222523 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0b46b116-4858-4b6a-b3ad-9337272f9a91","Type":"ContainerStarted","Data":"8c1b23b8ffccae7306c11a4c3d6415747d09b3586d46bdcc68e728a0936c32a0"}
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.245300 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f20bcc70-bf30-4949-951a-b36d083d205f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f20bcc70-bf30-4949-951a-b36d083d205f" (UID: "f20bcc70-bf30-4949-951a-b36d083d205f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.245991 4873 scope.go:117] "RemoveContainer" containerID="35d25dc52ac46707a9ed600e3f35ef54061a263d12693fd1d59a80b0fdce1fe0"
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.270256 4873 scope.go:117] "RemoveContainer" containerID="eba7f3213f6b8b4ed8b7cfa357b6cb425beb34251cc11a43d090e87e4e2033e9"
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.286638 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f20bcc70-bf30-4949-951a-b36d083d205f-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.286672 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-772f4\" (UniqueName: \"kubernetes.io/projected/f20bcc70-bf30-4949-951a-b36d083d205f-kube-api-access-772f4\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.286684 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f20bcc70-bf30-4949-951a-b36d083d205f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.560709 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9hfhg"]
Feb 19 10:01:14 crc kubenswrapper[4873]: I0219 10:01:14.567089 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9hfhg"]
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.058412 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.118041 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58ff7f48c5-nqbz4"]
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.119603 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" podUID="d736e93a-6a36-458e-a8f4-a9d511530043" containerName="dnsmasq-dns" containerID="cri-o://7de987cb52aa2fd800cbb7f97c53c2a6500299ae865b3cb3f8039ab8618c7750" gracePeriod=10
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.499354 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f20bcc70-bf30-4949-951a-b36d083d205f" path="/var/lib/kubelet/pods/f20bcc70-bf30-4949-951a-b36d083d205f/volumes"
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.676071 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4"
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.760278 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.760355 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.810236 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d736e93a-6a36-458e-a8f4-a9d511530043-dns-svc\") pod \"d736e93a-6a36-458e-a8f4-a9d511530043\" (UID: \"d736e93a-6a36-458e-a8f4-a9d511530043\") "
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.810616 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5m2m\" (UniqueName: \"kubernetes.io/projected/d736e93a-6a36-458e-a8f4-a9d511530043-kube-api-access-s5m2m\") pod \"d736e93a-6a36-458e-a8f4-a9d511530043\" (UID: \"d736e93a-6a36-458e-a8f4-a9d511530043\") "
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.810843 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d736e93a-6a36-458e-a8f4-a9d511530043-config\") pod \"d736e93a-6a36-458e-a8f4-a9d511530043\" (UID: \"d736e93a-6a36-458e-a8f4-a9d511530043\") "
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.817976 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d736e93a-6a36-458e-a8f4-a9d511530043-kube-api-access-s5m2m" (OuterVolumeSpecName: "kube-api-access-s5m2m") pod "d736e93a-6a36-458e-a8f4-a9d511530043" (UID: "d736e93a-6a36-458e-a8f4-a9d511530043"). InnerVolumeSpecName "kube-api-access-s5m2m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.857173 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d736e93a-6a36-458e-a8f4-a9d511530043-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d736e93a-6a36-458e-a8f4-a9d511530043" (UID: "d736e93a-6a36-458e-a8f4-a9d511530043"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.857858 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d736e93a-6a36-458e-a8f4-a9d511530043-config" (OuterVolumeSpecName: "config") pod "d736e93a-6a36-458e-a8f4-a9d511530043" (UID: "d736e93a-6a36-458e-a8f4-a9d511530043"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.913396 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5m2m\" (UniqueName: \"kubernetes.io/projected/d736e93a-6a36-458e-a8f4-a9d511530043-kube-api-access-s5m2m\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.913428 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d736e93a-6a36-458e-a8f4-a9d511530043-config\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:15 crc kubenswrapper[4873]: I0219 10:01:15.913438 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d736e93a-6a36-458e-a8f4-a9d511530043-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:16 crc kubenswrapper[4873]: I0219 10:01:16.256306 4873 generic.go:334] "Generic (PLEG): container finished" podID="d736e93a-6a36-458e-a8f4-a9d511530043" containerID="7de987cb52aa2fd800cbb7f97c53c2a6500299ae865b3cb3f8039ab8618c7750" exitCode=0
Feb 19 10:01:16 crc kubenswrapper[4873]: I0219 10:01:16.256675 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" event={"ID":"d736e93a-6a36-458e-a8f4-a9d511530043","Type":"ContainerDied","Data":"7de987cb52aa2fd800cbb7f97c53c2a6500299ae865b3cb3f8039ab8618c7750"}
Feb 19 10:01:16 crc kubenswrapper[4873]: I0219 10:01:16.256710 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4" event={"ID":"d736e93a-6a36-458e-a8f4-a9d511530043","Type":"ContainerDied","Data":"9fb66c25e5ae993349fe01e14d2ddec63fb2fa9c9f8fdca8f66724a42a087dcf"}
Feb 19 10:01:16 crc kubenswrapper[4873]: I0219 10:01:16.256769 4873 scope.go:117] "RemoveContainer" containerID="7de987cb52aa2fd800cbb7f97c53c2a6500299ae865b3cb3f8039ab8618c7750"
Feb 19 10:01:16 crc kubenswrapper[4873]: I0219 10:01:16.257003 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58ff7f48c5-nqbz4"
Feb 19 10:01:16 crc kubenswrapper[4873]: I0219 10:01:16.299444 4873 scope.go:117] "RemoveContainer" containerID="11d98b54d3d242cf39469162eef86399c7ac915ea837018032d2240dd76c8ed3"
Feb 19 10:01:16 crc kubenswrapper[4873]: I0219 10:01:16.359176 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58ff7f48c5-nqbz4"]
Feb 19 10:01:16 crc kubenswrapper[4873]: I0219 10:01:16.375719 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58ff7f48c5-nqbz4"]
Feb 19 10:01:16 crc kubenswrapper[4873]: I0219 10:01:16.396670 4873 scope.go:117] "RemoveContainer" containerID="7de987cb52aa2fd800cbb7f97c53c2a6500299ae865b3cb3f8039ab8618c7750"
Feb 19 10:01:16 crc kubenswrapper[4873]: E0219 10:01:16.400225 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7de987cb52aa2fd800cbb7f97c53c2a6500299ae865b3cb3f8039ab8618c7750\": container with ID starting with 7de987cb52aa2fd800cbb7f97c53c2a6500299ae865b3cb3f8039ab8618c7750 not found: ID does not exist" containerID="7de987cb52aa2fd800cbb7f97c53c2a6500299ae865b3cb3f8039ab8618c7750"
Feb 19 10:01:16 crc kubenswrapper[4873]: I0219 10:01:16.400277 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7de987cb52aa2fd800cbb7f97c53c2a6500299ae865b3cb3f8039ab8618c7750"} err="failed to get container status \"7de987cb52aa2fd800cbb7f97c53c2a6500299ae865b3cb3f8039ab8618c7750\": rpc error: code = NotFound desc = could not find container \"7de987cb52aa2fd800cbb7f97c53c2a6500299ae865b3cb3f8039ab8618c7750\": container with ID starting with 7de987cb52aa2fd800cbb7f97c53c2a6500299ae865b3cb3f8039ab8618c7750 not found: ID does not exist"
Feb 19 10:01:16 crc kubenswrapper[4873]: I0219 10:01:16.400304 4873 scope.go:117] "RemoveContainer" containerID="11d98b54d3d242cf39469162eef86399c7ac915ea837018032d2240dd76c8ed3"
Feb 19 10:01:16 crc kubenswrapper[4873]: E0219 10:01:16.404241 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11d98b54d3d242cf39469162eef86399c7ac915ea837018032d2240dd76c8ed3\": container with ID starting with 11d98b54d3d242cf39469162eef86399c7ac915ea837018032d2240dd76c8ed3 not found: ID does not exist" containerID="11d98b54d3d242cf39469162eef86399c7ac915ea837018032d2240dd76c8ed3"
Feb 19 10:01:16 crc kubenswrapper[4873]: I0219 10:01:16.404281 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11d98b54d3d242cf39469162eef86399c7ac915ea837018032d2240dd76c8ed3"} err="failed to get container status \"11d98b54d3d242cf39469162eef86399c7ac915ea837018032d2240dd76c8ed3\": rpc error: code = NotFound desc = could not find container \"11d98b54d3d242cf39469162eef86399c7ac915ea837018032d2240dd76c8ed3\": container with ID starting with 11d98b54d3d242cf39469162eef86399c7ac915ea837018032d2240dd76c8ed3 not found: ID does not exist"
Feb 19 10:01:16 crc kubenswrapper[4873]: I0219 10:01:16.626854 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:16 crc kubenswrapper[4873]: E0219 10:01:16.627701 4873 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 10:01:16 crc kubenswrapper[4873]: E0219 10:01:16.627738 4873 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 10:01:16 crc kubenswrapper[4873]: E0219 10:01:16.627820 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift podName:c3b21a02-7162-42ca-84cf-e0fa36b04a22 nodeName:}" failed. No retries permitted until 2026-02-19 10:01:32.62779836 +0000 UTC m=+1001.917230068 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift") pod "swift-storage-0" (UID: "c3b21a02-7162-42ca-84cf-e0fa36b04a22") : configmap "swift-ring-files" not found
Feb 19 10:01:17 crc kubenswrapper[4873]: I0219 10:01:17.130142 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Feb 19 10:01:17 crc kubenswrapper[4873]: I0219 10:01:17.131077 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Feb 19 10:01:17 crc kubenswrapper[4873]: I0219 10:01:17.266653 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Feb 19 10:01:17 crc kubenswrapper[4873]: I0219 10:01:17.270956 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0b46b116-4858-4b6a-b3ad-9337272f9a91","Type":"ContainerStarted","Data":"25a26e338e379f85c4c91e347569e9af0c97d68170a522acb020aee8d309e23c"}
Feb 19 10:01:17 crc kubenswrapper[4873]: I0219 10:01:17.375605 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Feb 19 10:01:17 crc kubenswrapper[4873]: I0219 10:01:17.498270 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d736e93a-6a36-458e-a8f4-a9d511530043" path="/var/lib/kubelet/pods/d736e93a-6a36-458e-a8f4-a9d511530043/volumes"
Feb 19 10:01:17 crc kubenswrapper[4873]: I0219 10:01:17.823197 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vsnt5" podUID="b0ab9d21-0c11-4940-ad43-3e20c46012ad" containerName="ovn-controller" probeResult="failure" output=<
Feb 19 10:01:17 crc kubenswrapper[4873]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 19 10:01:17 crc kubenswrapper[4873]: >
Feb 19 10:01:18 crc kubenswrapper[4873]: I0219 10:01:18.281278 4873 generic.go:334] "Generic (PLEG): container finished" podID="91fbca18-847d-4e7b-8a40-e52dd348d155" containerID="9dbc805ae65f008dd93910bebe60ab94566a15abae9fcb90aa3ea07d5a696df2" exitCode=0
Feb 19 10:01:18 crc kubenswrapper[4873]: I0219 10:01:18.281373 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mx6qq" event={"ID":"91fbca18-847d-4e7b-8a40-e52dd348d155","Type":"ContainerDied","Data":"9dbc805ae65f008dd93910bebe60ab94566a15abae9fcb90aa3ea07d5a696df2"}
Feb 19 10:01:18 crc kubenswrapper[4873]: I0219 10:01:18.959975 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lmkgp" podUID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerName="registry-server" probeResult="failure" output=<
Feb 19 10:01:18 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s
Feb 19 10:01:18 crc kubenswrapper[4873]: >
Feb 19 10:01:19 crc kubenswrapper[4873]: I0219 10:01:19.593314 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 19 10:01:19 crc kubenswrapper[4873]: I0219 10:01:19.863666 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.015401 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.240213 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.302534 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mx6qq"
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.302534 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mx6qq" event={"ID":"91fbca18-847d-4e7b-8a40-e52dd348d155","Type":"ContainerDied","Data":"a4cfa9b70a12969456d4389138494314fd5620b64ab7eefa7cbad4c7f8f20a88"}
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.302684 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4cfa9b70a12969456d4389138494314fd5620b64ab7eefa7cbad4c7f8f20a88"
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.304870 4873 generic.go:334] "Generic (PLEG): container finished" podID="da89f0ff-c51c-4c4a-8df4-f7787d29ddd2" containerID="57567e0456fd2d45349518902ef81c44525ff9ae50ef5a0fa8ed4d2a66526532" exitCode=0
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.304958 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2","Type":"ContainerDied","Data":"57567e0456fd2d45349518902ef81c44525ff9ae50ef5a0fa8ed4d2a66526532"}
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.309250 4873 generic.go:334] "Generic (PLEG): container finished" podID="9251ac9a-275e-4622-83a2-121d59ec8cd1" containerID="190198ff0e2017378a91068584666652381227259351cc680fcbb3817b8e453f" exitCode=0
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.309295 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9251ac9a-275e-4622-83a2-121d59ec8cd1","Type":"ContainerDied","Data":"190198ff0e2017378a91068584666652381227259351cc680fcbb3817b8e453f"}
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.312299 4873 generic.go:334] "Generic (PLEG): container finished" podID="86685946-19ac-434a-974f-99b5beeda172" containerID="853752f484df74ddd70c3a27d9c1c59cdeac53c948f829a75d4f8ed34050d1fb" exitCode=0
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.312350 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"86685946-19ac-434a-974f-99b5beeda172","Type":"ContainerDied","Data":"853752f484df74ddd70c3a27d9c1c59cdeac53c948f829a75d4f8ed34050d1fb"}
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.322275 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-combined-ca-bundle\") pod \"91fbca18-847d-4e7b-8a40-e52dd348d155\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") "
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.322323 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-swiftconf\") pod \"91fbca18-847d-4e7b-8a40-e52dd348d155\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") "
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.322352 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/91fbca18-847d-4e7b-8a40-e52dd348d155-etc-swift\") pod \"91fbca18-847d-4e7b-8a40-e52dd348d155\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") "
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.322375 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91fbca18-847d-4e7b-8a40-e52dd348d155-scripts\") pod \"91fbca18-847d-4e7b-8a40-e52dd348d155\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") "
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.322396 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/91fbca18-847d-4e7b-8a40-e52dd348d155-ring-data-devices\") pod \"91fbca18-847d-4e7b-8a40-e52dd348d155\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") "
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.322498 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-dispersionconf\") pod \"91fbca18-847d-4e7b-8a40-e52dd348d155\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") "
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.322536 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5klgc\" (UniqueName: \"kubernetes.io/projected/91fbca18-847d-4e7b-8a40-e52dd348d155-kube-api-access-5klgc\") pod \"91fbca18-847d-4e7b-8a40-e52dd348d155\" (UID: \"91fbca18-847d-4e7b-8a40-e52dd348d155\") "
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.324118 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91fbca18-847d-4e7b-8a40-e52dd348d155-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "91fbca18-847d-4e7b-8a40-e52dd348d155" (UID: "91fbca18-847d-4e7b-8a40-e52dd348d155"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.324946 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91fbca18-847d-4e7b-8a40-e52dd348d155-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "91fbca18-847d-4e7b-8a40-e52dd348d155" (UID: "91fbca18-847d-4e7b-8a40-e52dd348d155"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.329765 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91fbca18-847d-4e7b-8a40-e52dd348d155-kube-api-access-5klgc" (OuterVolumeSpecName: "kube-api-access-5klgc") pod "91fbca18-847d-4e7b-8a40-e52dd348d155" (UID: "91fbca18-847d-4e7b-8a40-e52dd348d155"). InnerVolumeSpecName "kube-api-access-5klgc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.332592 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "91fbca18-847d-4e7b-8a40-e52dd348d155" (UID: "91fbca18-847d-4e7b-8a40-e52dd348d155"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.355852 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91fbca18-847d-4e7b-8a40-e52dd348d155" (UID: "91fbca18-847d-4e7b-8a40-e52dd348d155"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.376711 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91fbca18-847d-4e7b-8a40-e52dd348d155-scripts" (OuterVolumeSpecName: "scripts") pod "91fbca18-847d-4e7b-8a40-e52dd348d155" (UID: "91fbca18-847d-4e7b-8a40-e52dd348d155"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.380590 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "91fbca18-847d-4e7b-8a40-e52dd348d155" (UID: "91fbca18-847d-4e7b-8a40-e52dd348d155"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.424624 4873 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-dispersionconf\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.424890 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5klgc\" (UniqueName: \"kubernetes.io/projected/91fbca18-847d-4e7b-8a40-e52dd348d155-kube-api-access-5klgc\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.424985 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.425073 4873 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/91fbca18-847d-4e7b-8a40-e52dd348d155-swiftconf\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.425214 4873 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/91fbca18-847d-4e7b-8a40-e52dd348d155-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.426179 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91fbca18-847d-4e7b-8a40-e52dd348d155-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:20 crc kubenswrapper[4873]: I0219 10:01:20.426280 4873 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/91fbca18-847d-4e7b-8a40-e52dd348d155-ring-data-devices\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:21 crc kubenswrapper[4873]: I0219 10:01:21.323075 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"86685946-19ac-434a-974f-99b5beeda172","Type":"ContainerStarted","Data":"aaf5e7116cb1bb3c6e4b474f08773f59b49977479dfd0072b42ee20f8a60cf72"}
Feb 19 10:01:21 crc kubenswrapper[4873]: I0219 10:01:21.324132 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 10:01:21 crc kubenswrapper[4873]: I0219 10:01:21.325833 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"da89f0ff-c51c-4c4a-8df4-f7787d29ddd2","Type":"ContainerStarted","Data":"f2db44adca6cf6604ba2d059438ee7864c06927d8a0ee8e55a2e26392bb8b834"}
Feb 19 10:01:21 crc kubenswrapper[4873]: I0219 10:01:21.326037 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/notifications-rabbitmq-server-0"
Feb 19 10:01:21 crc kubenswrapper[4873]: I0219 10:01:21.328977 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0b46b116-4858-4b6a-b3ad-9337272f9a91","Type":"ContainerStarted","Data":"7feaa15c2585dfd6c2a4ea45a4afeb2729895aad024065a30a27359792d333ac"}
Feb 19 10:01:21 crc kubenswrapper[4873]: I0219 10:01:21.330904 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9251ac9a-275e-4622-83a2-121d59ec8cd1","Type":"ContainerStarted","Data":"bf1da2993e733619c7b1bf83f5278da19d4e84782b865399c0e41f4b70eb6bd9"}
Feb 19 10:01:21 crc
kubenswrapper[4873]: I0219 10:01:21.331171 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 19 10:01:21 crc kubenswrapper[4873]: I0219 10:01:21.351325 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=61.276451502 podStartE2EDuration="1m9.351307936s" podCreationTimestamp="2026-02-19 10:00:12 +0000 UTC" firstStartedPulling="2026-02-19 10:00:27.999327016 +0000 UTC m=+937.288758654" lastFinishedPulling="2026-02-19 10:00:36.07418344 +0000 UTC m=+945.363615088" observedRunningTime="2026-02-19 10:01:21.349793878 +0000 UTC m=+990.639225516" watchObservedRunningTime="2026-02-19 10:01:21.351307936 +0000 UTC m=+990.640739574" Feb 19 10:01:21 crc kubenswrapper[4873]: I0219 10:01:21.390088 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=10.390858734 podStartE2EDuration="1m2.390068657s" podCreationTimestamp="2026-02-19 10:00:19 +0000 UTC" firstStartedPulling="2026-02-19 10:00:28.386744923 +0000 UTC m=+937.676176561" lastFinishedPulling="2026-02-19 10:01:20.385954846 +0000 UTC m=+989.675386484" observedRunningTime="2026-02-19 10:01:21.385686349 +0000 UTC m=+990.675118007" watchObservedRunningTime="2026-02-19 10:01:21.390068657 +0000 UTC m=+990.679500295" Feb 19 10:01:21 crc kubenswrapper[4873]: I0219 10:01:21.427443 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=61.340230855 podStartE2EDuration="1m9.427422664s" podCreationTimestamp="2026-02-19 10:00:12 +0000 UTC" firstStartedPulling="2026-02-19 10:00:27.036805875 +0000 UTC m=+936.326237513" lastFinishedPulling="2026-02-19 10:00:35.123997684 +0000 UTC m=+944.413429322" observedRunningTime="2026-02-19 10:01:21.420058621 +0000 UTC m=+990.709490279" watchObservedRunningTime="2026-02-19 10:01:21.427422664 +0000 UTC 
m=+990.716854312" Feb 19 10:01:21 crc kubenswrapper[4873]: I0219 10:01:21.447628 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/notifications-rabbitmq-server-0" podStartSLOduration=61.040081322 podStartE2EDuration="1m9.447602195s" podCreationTimestamp="2026-02-19 10:00:12 +0000 UTC" firstStartedPulling="2026-02-19 10:00:27.66757226 +0000 UTC m=+936.957003898" lastFinishedPulling="2026-02-19 10:00:36.075093133 +0000 UTC m=+945.364524771" observedRunningTime="2026-02-19 10:01:21.442643462 +0000 UTC m=+990.732075100" watchObservedRunningTime="2026-02-19 10:01:21.447602195 +0000 UTC m=+990.737033833" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.487294 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c73a-account-create-update-zxxrn"] Feb 19 10:01:22 crc kubenswrapper[4873]: E0219 10:01:22.488209 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" containerName="extract-utilities" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.488280 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" containerName="extract-utilities" Feb 19 10:01:22 crc kubenswrapper[4873]: E0219 10:01:22.488356 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d0de876-c87f-4760-b06f-87b8ff7e5588" containerName="init" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.488418 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d0de876-c87f-4760-b06f-87b8ff7e5588" containerName="init" Feb 19 10:01:22 crc kubenswrapper[4873]: E0219 10:01:22.488480 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20bcc70-bf30-4949-951a-b36d083d205f" containerName="extract-content" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.488537 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20bcc70-bf30-4949-951a-b36d083d205f" containerName="extract-content" Feb 19 
10:01:22 crc kubenswrapper[4873]: E0219 10:01:22.488597 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fbca18-847d-4e7b-8a40-e52dd348d155" containerName="swift-ring-rebalance" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.488648 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fbca18-847d-4e7b-8a40-e52dd348d155" containerName="swift-ring-rebalance" Feb 19 10:01:22 crc kubenswrapper[4873]: E0219 10:01:22.488705 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d736e93a-6a36-458e-a8f4-a9d511530043" containerName="init" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.488763 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d736e93a-6a36-458e-a8f4-a9d511530043" containerName="init" Feb 19 10:01:22 crc kubenswrapper[4873]: E0219 10:01:22.488830 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d736e93a-6a36-458e-a8f4-a9d511530043" containerName="dnsmasq-dns" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.488905 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d736e93a-6a36-458e-a8f4-a9d511530043" containerName="dnsmasq-dns" Feb 19 10:01:22 crc kubenswrapper[4873]: E0219 10:01:22.488986 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" containerName="extract-content" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.489048 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" containerName="extract-content" Feb 19 10:01:22 crc kubenswrapper[4873]: E0219 10:01:22.489135 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8568b0bc-e3d1-4e4e-8172-bada186b750a" containerName="extract-utilities" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.489197 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8568b0bc-e3d1-4e4e-8172-bada186b750a" containerName="extract-utilities" Feb 19 10:01:22 crc 
kubenswrapper[4873]: E0219 10:01:22.489267 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4069c7b-b867-4c6b-b5dd-91529a59d01c" containerName="init" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.489318 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4069c7b-b867-4c6b-b5dd-91529a59d01c" containerName="init" Feb 19 10:01:22 crc kubenswrapper[4873]: E0219 10:01:22.489382 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8568b0bc-e3d1-4e4e-8172-bada186b750a" containerName="registry-server" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.489446 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8568b0bc-e3d1-4e4e-8172-bada186b750a" containerName="registry-server" Feb 19 10:01:22 crc kubenswrapper[4873]: E0219 10:01:22.489506 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8568b0bc-e3d1-4e4e-8172-bada186b750a" containerName="extract-content" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.489560 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8568b0bc-e3d1-4e4e-8172-bada186b750a" containerName="extract-content" Feb 19 10:01:22 crc kubenswrapper[4873]: E0219 10:01:22.489626 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20bcc70-bf30-4949-951a-b36d083d205f" containerName="extract-utilities" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.489703 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20bcc70-bf30-4949-951a-b36d083d205f" containerName="extract-utilities" Feb 19 10:01:22 crc kubenswrapper[4873]: E0219 10:01:22.489798 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" containerName="registry-server" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.489858 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" containerName="registry-server" Feb 19 10:01:22 crc kubenswrapper[4873]: E0219 
10:01:22.489931 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20bcc70-bf30-4949-951a-b36d083d205f" containerName="registry-server" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.489985 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20bcc70-bf30-4949-951a-b36d083d205f" containerName="registry-server" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.490214 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="91fbca18-847d-4e7b-8a40-e52dd348d155" containerName="swift-ring-rebalance" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.490287 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="8568b0bc-e3d1-4e4e-8172-bada186b750a" containerName="registry-server" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.490347 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="d736e93a-6a36-458e-a8f4-a9d511530043" containerName="dnsmasq-dns" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.490408 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f20bcc70-bf30-4949-951a-b36d083d205f" containerName="registry-server" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.490468 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d0de876-c87f-4760-b06f-87b8ff7e5588" containerName="init" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.490530 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd6b83dc-5d8c-48f7-9e5e-9373c786f31e" containerName="registry-server" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.490595 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4069c7b-b867-4c6b-b5dd-91529a59d01c" containerName="init" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.492842 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c73a-account-create-update-zxxrn" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.500533 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.504455 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c73a-account-create-update-zxxrn"] Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.536539 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-p55tt"] Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.537522 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-p55tt" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.553954 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-p55tt"] Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.664211 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88-operator-scripts\") pod \"glance-c73a-account-create-update-zxxrn\" (UID: \"fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88\") " pod="openstack/glance-c73a-account-create-update-zxxrn" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.664282 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d81f72af-8420-4334-811e-f0e0cc1c7731-operator-scripts\") pod \"glance-db-create-p55tt\" (UID: \"d81f72af-8420-4334-811e-f0e0cc1c7731\") " pod="openstack/glance-db-create-p55tt" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.664317 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbm47\" (UniqueName: 
\"kubernetes.io/projected/d81f72af-8420-4334-811e-f0e0cc1c7731-kube-api-access-fbm47\") pod \"glance-db-create-p55tt\" (UID: \"d81f72af-8420-4334-811e-f0e0cc1c7731\") " pod="openstack/glance-db-create-p55tt" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.664353 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6hvh\" (UniqueName: \"kubernetes.io/projected/fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88-kube-api-access-h6hvh\") pod \"glance-c73a-account-create-update-zxxrn\" (UID: \"fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88\") " pod="openstack/glance-c73a-account-create-update-zxxrn" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.766024 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbm47\" (UniqueName: \"kubernetes.io/projected/d81f72af-8420-4334-811e-f0e0cc1c7731-kube-api-access-fbm47\") pod \"glance-db-create-p55tt\" (UID: \"d81f72af-8420-4334-811e-f0e0cc1c7731\") " pod="openstack/glance-db-create-p55tt" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.766459 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6hvh\" (UniqueName: \"kubernetes.io/projected/fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88-kube-api-access-h6hvh\") pod \"glance-c73a-account-create-update-zxxrn\" (UID: \"fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88\") " pod="openstack/glance-c73a-account-create-update-zxxrn" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.766703 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88-operator-scripts\") pod \"glance-c73a-account-create-update-zxxrn\" (UID: \"fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88\") " pod="openstack/glance-c73a-account-create-update-zxxrn" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.767334 4873 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88-operator-scripts\") pod \"glance-c73a-account-create-update-zxxrn\" (UID: \"fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88\") " pod="openstack/glance-c73a-account-create-update-zxxrn" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.767394 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d81f72af-8420-4334-811e-f0e0cc1c7731-operator-scripts\") pod \"glance-db-create-p55tt\" (UID: \"d81f72af-8420-4334-811e-f0e0cc1c7731\") " pod="openstack/glance-db-create-p55tt" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.767929 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d81f72af-8420-4334-811e-f0e0cc1c7731-operator-scripts\") pod \"glance-db-create-p55tt\" (UID: \"d81f72af-8420-4334-811e-f0e0cc1c7731\") " pod="openstack/glance-db-create-p55tt" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.784598 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbm47\" (UniqueName: \"kubernetes.io/projected/d81f72af-8420-4334-811e-f0e0cc1c7731-kube-api-access-fbm47\") pod \"glance-db-create-p55tt\" (UID: \"d81f72af-8420-4334-811e-f0e0cc1c7731\") " pod="openstack/glance-db-create-p55tt" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.791646 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6hvh\" (UniqueName: \"kubernetes.io/projected/fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88-kube-api-access-h6hvh\") pod \"glance-c73a-account-create-update-zxxrn\" (UID: \"fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88\") " pod="openstack/glance-c73a-account-create-update-zxxrn" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.811951 4873 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ovn-controller-vsnt5" podUID="b0ab9d21-0c11-4940-ad43-3e20c46012ad" containerName="ovn-controller" probeResult="failure" output=< Feb 19 10:01:22 crc kubenswrapper[4873]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 10:01:22 crc kubenswrapper[4873]: > Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.816092 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c73a-account-create-update-zxxrn" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.841220 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.841272 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-t5bgp" Feb 19 10:01:22 crc kubenswrapper[4873]: I0219 10:01:22.856862 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-p55tt" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.061463 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vsnt5-config-mj2kr"] Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.063629 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.069925 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vsnt5-config-mj2kr"] Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.069949 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.165150 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-p55tt"] Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.175144 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6410809-f775-4bf8-bc41-63f159854e76-scripts\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.175288 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6410809-f775-4bf8-bc41-63f159854e76-additional-scripts\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.175331 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-run\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.175385 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsbdg\" 
(UniqueName: \"kubernetes.io/projected/b6410809-f775-4bf8-bc41-63f159854e76-kube-api-access-lsbdg\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.175409 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-run-ovn\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.175483 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-log-ovn\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.276845 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6410809-f775-4bf8-bc41-63f159854e76-scripts\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.276943 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6410809-f775-4bf8-bc41-63f159854e76-additional-scripts\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.276973 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-run\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.276993 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsbdg\" (UniqueName: \"kubernetes.io/projected/b6410809-f775-4bf8-bc41-63f159854e76-kube-api-access-lsbdg\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.277012 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-run-ovn\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.277057 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-log-ovn\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.277361 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-log-ovn\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.277410 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-run-ovn\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.277519 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-run\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.277842 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6410809-f775-4bf8-bc41-63f159854e76-additional-scripts\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.279319 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6410809-f775-4bf8-bc41-63f159854e76-scripts\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.298160 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsbdg\" (UniqueName: \"kubernetes.io/projected/b6410809-f775-4bf8-bc41-63f159854e76-kube-api-access-lsbdg\") pod \"ovn-controller-vsnt5-config-mj2kr\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.358052 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p55tt" 
event={"ID":"d81f72af-8420-4334-811e-f0e0cc1c7731","Type":"ContainerStarted","Data":"1dee66111cb00be8e1d7b10f7d4c7537cb2d5b855ce8c1d8f116d504503e7207"} Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.390115 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.434250 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c73a-account-create-update-zxxrn"] Feb 19 10:01:23 crc kubenswrapper[4873]: W0219 10:01:23.440825 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd0e3e74_f1aa_4b5f_a2ae_b89f90644f88.slice/crio-737a12bc43c5a907f8cb10150d1c5a92a48abbaf8d51eba39a18020cba56938f WatchSource:0}: Error finding container 737a12bc43c5a907f8cb10150d1c5a92a48abbaf8d51eba39a18020cba56938f: Status 404 returned error can't find the container with id 737a12bc43c5a907f8cb10150d1c5a92a48abbaf8d51eba39a18020cba56938f Feb 19 10:01:23 crc kubenswrapper[4873]: I0219 10:01:23.867660 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vsnt5-config-mj2kr"] Feb 19 10:01:24 crc kubenswrapper[4873]: I0219 10:01:24.340945 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-d9nmp"] Feb 19 10:01:24 crc kubenswrapper[4873]: I0219 10:01:24.342596 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-d9nmp" Feb 19 10:01:24 crc kubenswrapper[4873]: I0219 10:01:24.344853 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 19 10:01:24 crc kubenswrapper[4873]: I0219 10:01:24.350427 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d9nmp"] Feb 19 10:01:24 crc kubenswrapper[4873]: I0219 10:01:24.371803 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vsnt5-config-mj2kr" event={"ID":"b6410809-f775-4bf8-bc41-63f159854e76","Type":"ContainerStarted","Data":"0abe3568e1e754df8609a1e142601381c885a8173a04bd7acf2a44dd4c765ac5"} Feb 19 10:01:24 crc kubenswrapper[4873]: I0219 10:01:24.372725 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c73a-account-create-update-zxxrn" event={"ID":"fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88","Type":"ContainerStarted","Data":"737a12bc43c5a907f8cb10150d1c5a92a48abbaf8d51eba39a18020cba56938f"} Feb 19 10:01:24 crc kubenswrapper[4873]: I0219 10:01:24.482141 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 19 10:01:24 crc kubenswrapper[4873]: I0219 10:01:24.495970 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqt7l\" (UniqueName: \"kubernetes.io/projected/ad0c7de4-412a-4e8c-90d0-817151c8a015-kube-api-access-pqt7l\") pod \"root-account-create-update-d9nmp\" (UID: \"ad0c7de4-412a-4e8c-90d0-817151c8a015\") " pod="openstack/root-account-create-update-d9nmp" Feb 19 10:01:24 crc kubenswrapper[4873]: I0219 10:01:24.496027 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad0c7de4-412a-4e8c-90d0-817151c8a015-operator-scripts\") pod \"root-account-create-update-d9nmp\" (UID: 
\"ad0c7de4-412a-4e8c-90d0-817151c8a015\") " pod="openstack/root-account-create-update-d9nmp" Feb 19 10:01:24 crc kubenswrapper[4873]: I0219 10:01:24.597917 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqt7l\" (UniqueName: \"kubernetes.io/projected/ad0c7de4-412a-4e8c-90d0-817151c8a015-kube-api-access-pqt7l\") pod \"root-account-create-update-d9nmp\" (UID: \"ad0c7de4-412a-4e8c-90d0-817151c8a015\") " pod="openstack/root-account-create-update-d9nmp" Feb 19 10:01:24 crc kubenswrapper[4873]: I0219 10:01:24.598003 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad0c7de4-412a-4e8c-90d0-817151c8a015-operator-scripts\") pod \"root-account-create-update-d9nmp\" (UID: \"ad0c7de4-412a-4e8c-90d0-817151c8a015\") " pod="openstack/root-account-create-update-d9nmp" Feb 19 10:01:24 crc kubenswrapper[4873]: I0219 10:01:24.599701 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad0c7de4-412a-4e8c-90d0-817151c8a015-operator-scripts\") pod \"root-account-create-update-d9nmp\" (UID: \"ad0c7de4-412a-4e8c-90d0-817151c8a015\") " pod="openstack/root-account-create-update-d9nmp" Feb 19 10:01:24 crc kubenswrapper[4873]: I0219 10:01:24.638943 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqt7l\" (UniqueName: \"kubernetes.io/projected/ad0c7de4-412a-4e8c-90d0-817151c8a015-kube-api-access-pqt7l\") pod \"root-account-create-update-d9nmp\" (UID: \"ad0c7de4-412a-4e8c-90d0-817151c8a015\") " pod="openstack/root-account-create-update-d9nmp" Feb 19 10:01:24 crc kubenswrapper[4873]: I0219 10:01:24.656427 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-d9nmp" Feb 19 10:01:25 crc kubenswrapper[4873]: I0219 10:01:25.136937 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d9nmp"] Feb 19 10:01:25 crc kubenswrapper[4873]: I0219 10:01:25.381125 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p55tt" event={"ID":"d81f72af-8420-4334-811e-f0e0cc1c7731","Type":"ContainerStarted","Data":"7cdae30c7bd3b7746068a98e9abd29f90395d96b144f2365d0ebb1da465756e0"} Feb 19 10:01:25 crc kubenswrapper[4873]: I0219 10:01:25.382030 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d9nmp" event={"ID":"ad0c7de4-412a-4e8c-90d0-817151c8a015","Type":"ContainerStarted","Data":"b1bc87ff69bfb1fa797a7dfd6ebfbe51b81cc642abb258a9974b216355561af2"} Feb 19 10:01:25 crc kubenswrapper[4873]: I0219 10:01:25.383414 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c73a-account-create-update-zxxrn" event={"ID":"fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88","Type":"ContainerStarted","Data":"5b7d076de566fc4d5772c9116560231f2acaebd0aad62281f0cb88965b142cc2"} Feb 19 10:01:25 crc kubenswrapper[4873]: I0219 10:01:25.403265 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-p55tt" podStartSLOduration=3.40324313 podStartE2EDuration="3.40324313s" podCreationTimestamp="2026-02-19 10:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:01:25.398882352 +0000 UTC m=+994.688313990" watchObservedRunningTime="2026-02-19 10:01:25.40324313 +0000 UTC m=+994.692674768" Feb 19 10:01:25 crc kubenswrapper[4873]: I0219 10:01:25.863966 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:26 crc kubenswrapper[4873]: I0219 10:01:26.393176 4873 
generic.go:334] "Generic (PLEG): container finished" podID="ad0c7de4-412a-4e8c-90d0-817151c8a015" containerID="2f0ffc7ea2219fb39042b2ae636be2bc871ede3a5af5f5056178cf8abfebcb4d" exitCode=0 Feb 19 10:01:26 crc kubenswrapper[4873]: I0219 10:01:26.393245 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d9nmp" event={"ID":"ad0c7de4-412a-4e8c-90d0-817151c8a015","Type":"ContainerDied","Data":"2f0ffc7ea2219fb39042b2ae636be2bc871ede3a5af5f5056178cf8abfebcb4d"} Feb 19 10:01:26 crc kubenswrapper[4873]: I0219 10:01:26.395334 4873 generic.go:334] "Generic (PLEG): container finished" podID="fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88" containerID="5b7d076de566fc4d5772c9116560231f2acaebd0aad62281f0cb88965b142cc2" exitCode=0 Feb 19 10:01:26 crc kubenswrapper[4873]: I0219 10:01:26.395406 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c73a-account-create-update-zxxrn" event={"ID":"fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88","Type":"ContainerDied","Data":"5b7d076de566fc4d5772c9116560231f2acaebd0aad62281f0cb88965b142cc2"} Feb 19 10:01:26 crc kubenswrapper[4873]: I0219 10:01:26.400022 4873 generic.go:334] "Generic (PLEG): container finished" podID="d81f72af-8420-4334-811e-f0e0cc1c7731" containerID="7cdae30c7bd3b7746068a98e9abd29f90395d96b144f2365d0ebb1da465756e0" exitCode=0 Feb 19 10:01:26 crc kubenswrapper[4873]: I0219 10:01:26.400090 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p55tt" event={"ID":"d81f72af-8420-4334-811e-f0e0cc1c7731","Type":"ContainerDied","Data":"7cdae30c7bd3b7746068a98e9abd29f90395d96b144f2365d0ebb1da465756e0"} Feb 19 10:01:26 crc kubenswrapper[4873]: I0219 10:01:26.402180 4873 generic.go:334] "Generic (PLEG): container finished" podID="b6410809-f775-4bf8-bc41-63f159854e76" containerID="300c17fe87cdc74fea5cc1a915ff92db53e3c3a4eee6ced7352b06833035dffb" exitCode=0 Feb 19 10:01:26 crc kubenswrapper[4873]: I0219 10:01:26.402233 4873 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-controller-vsnt5-config-mj2kr" event={"ID":"b6410809-f775-4bf8-bc41-63f159854e76","Type":"ContainerDied","Data":"300c17fe87cdc74fea5cc1a915ff92db53e3c3a4eee6ced7352b06833035dffb"} Feb 19 10:01:27 crc kubenswrapper[4873]: I0219 10:01:27.828792 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c73a-account-create-update-zxxrn" Feb 19 10:01:27 crc kubenswrapper[4873]: I0219 10:01:27.837600 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-vsnt5" Feb 19 10:01:27 crc kubenswrapper[4873]: I0219 10:01:27.957769 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6hvh\" (UniqueName: \"kubernetes.io/projected/fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88-kube-api-access-h6hvh\") pod \"fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88\" (UID: \"fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88\") " Feb 19 10:01:27 crc kubenswrapper[4873]: I0219 10:01:27.957826 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88-operator-scripts\") pod \"fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88\" (UID: \"fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88\") " Feb 19 10:01:27 crc kubenswrapper[4873]: I0219 10:01:27.959900 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88" (UID: "fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:27 crc kubenswrapper[4873]: I0219 10:01:27.966209 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88-kube-api-access-h6hvh" (OuterVolumeSpecName: "kube-api-access-h6hvh") pod "fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88" (UID: "fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88"). InnerVolumeSpecName "kube-api-access-h6hvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.042144 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d9nmp" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.049972 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.059953 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-p55tt" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.060021 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6hvh\" (UniqueName: \"kubernetes.io/projected/fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88-kube-api-access-h6hvh\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.060043 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.160732 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-log-ovn\") pod \"b6410809-f775-4bf8-bc41-63f159854e76\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.160804 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqt7l\" (UniqueName: \"kubernetes.io/projected/ad0c7de4-412a-4e8c-90d0-817151c8a015-kube-api-access-pqt7l\") pod \"ad0c7de4-412a-4e8c-90d0-817151c8a015\" (UID: \"ad0c7de4-412a-4e8c-90d0-817151c8a015\") " Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.160840 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6410809-f775-4bf8-bc41-63f159854e76-additional-scripts\") pod \"b6410809-f775-4bf8-bc41-63f159854e76\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.160887 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-run\") pod \"b6410809-f775-4bf8-bc41-63f159854e76\" (UID: 
\"b6410809-f775-4bf8-bc41-63f159854e76\") " Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.160912 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6410809-f775-4bf8-bc41-63f159854e76-scripts\") pod \"b6410809-f775-4bf8-bc41-63f159854e76\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.160902 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b6410809-f775-4bf8-bc41-63f159854e76" (UID: "b6410809-f775-4bf8-bc41-63f159854e76"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.160945 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad0c7de4-412a-4e8c-90d0-817151c8a015-operator-scripts\") pod \"ad0c7de4-412a-4e8c-90d0-817151c8a015\" (UID: \"ad0c7de4-412a-4e8c-90d0-817151c8a015\") " Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.160970 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-run-ovn\") pod \"b6410809-f775-4bf8-bc41-63f159854e76\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.161019 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbm47\" (UniqueName: \"kubernetes.io/projected/d81f72af-8420-4334-811e-f0e0cc1c7731-kube-api-access-fbm47\") pod \"d81f72af-8420-4334-811e-f0e0cc1c7731\" (UID: \"d81f72af-8420-4334-811e-f0e0cc1c7731\") " Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.161046 4873 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lsbdg\" (UniqueName: \"kubernetes.io/projected/b6410809-f775-4bf8-bc41-63f159854e76-kube-api-access-lsbdg\") pod \"b6410809-f775-4bf8-bc41-63f159854e76\" (UID: \"b6410809-f775-4bf8-bc41-63f159854e76\") " Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.161129 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d81f72af-8420-4334-811e-f0e0cc1c7731-operator-scripts\") pod \"d81f72af-8420-4334-811e-f0e0cc1c7731\" (UID: \"d81f72af-8420-4334-811e-f0e0cc1c7731\") " Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.161659 4873 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.160968 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-run" (OuterVolumeSpecName: "var-run") pod "b6410809-f775-4bf8-bc41-63f159854e76" (UID: "b6410809-f775-4bf8-bc41-63f159854e76"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.161437 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b6410809-f775-4bf8-bc41-63f159854e76" (UID: "b6410809-f775-4bf8-bc41-63f159854e76"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.161906 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6410809-f775-4bf8-bc41-63f159854e76-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b6410809-f775-4bf8-bc41-63f159854e76" (UID: "b6410809-f775-4bf8-bc41-63f159854e76"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.162114 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6410809-f775-4bf8-bc41-63f159854e76-scripts" (OuterVolumeSpecName: "scripts") pod "b6410809-f775-4bf8-bc41-63f159854e76" (UID: "b6410809-f775-4bf8-bc41-63f159854e76"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.162242 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad0c7de4-412a-4e8c-90d0-817151c8a015-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad0c7de4-412a-4e8c-90d0-817151c8a015" (UID: "ad0c7de4-412a-4e8c-90d0-817151c8a015"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.162251 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d81f72af-8420-4334-811e-f0e0cc1c7731-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d81f72af-8420-4334-811e-f0e0cc1c7731" (UID: "d81f72af-8420-4334-811e-f0e0cc1c7731"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.163709 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad0c7de4-412a-4e8c-90d0-817151c8a015-kube-api-access-pqt7l" (OuterVolumeSpecName: "kube-api-access-pqt7l") pod "ad0c7de4-412a-4e8c-90d0-817151c8a015" (UID: "ad0c7de4-412a-4e8c-90d0-817151c8a015"). InnerVolumeSpecName "kube-api-access-pqt7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.166582 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6410809-f775-4bf8-bc41-63f159854e76-kube-api-access-lsbdg" (OuterVolumeSpecName: "kube-api-access-lsbdg") pod "b6410809-f775-4bf8-bc41-63f159854e76" (UID: "b6410809-f775-4bf8-bc41-63f159854e76"). InnerVolumeSpecName "kube-api-access-lsbdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.169056 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d81f72af-8420-4334-811e-f0e0cc1c7731-kube-api-access-fbm47" (OuterVolumeSpecName: "kube-api-access-fbm47") pod "d81f72af-8420-4334-811e-f0e0cc1c7731" (UID: "d81f72af-8420-4334-811e-f0e0cc1c7731"). InnerVolumeSpecName "kube-api-access-fbm47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.262607 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqt7l\" (UniqueName: \"kubernetes.io/projected/ad0c7de4-412a-4e8c-90d0-817151c8a015-kube-api-access-pqt7l\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.262637 4873 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6410809-f775-4bf8-bc41-63f159854e76-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.262646 4873 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.262656 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6410809-f775-4bf8-bc41-63f159854e76-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.262666 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad0c7de4-412a-4e8c-90d0-817151c8a015-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.262674 4873 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6410809-f775-4bf8-bc41-63f159854e76-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.262684 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbm47\" (UniqueName: \"kubernetes.io/projected/d81f72af-8420-4334-811e-f0e0cc1c7731-kube-api-access-fbm47\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 
10:01:28.262693 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsbdg\" (UniqueName: \"kubernetes.io/projected/b6410809-f775-4bf8-bc41-63f159854e76-kube-api-access-lsbdg\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.262702 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d81f72af-8420-4334-811e-f0e0cc1c7731-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.302394 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-r4fbt"] Feb 19 10:01:28 crc kubenswrapper[4873]: E0219 10:01:28.302739 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad0c7de4-412a-4e8c-90d0-817151c8a015" containerName="mariadb-account-create-update" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.302755 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0c7de4-412a-4e8c-90d0-817151c8a015" containerName="mariadb-account-create-update" Feb 19 10:01:28 crc kubenswrapper[4873]: E0219 10:01:28.302777 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d81f72af-8420-4334-811e-f0e0cc1c7731" containerName="mariadb-database-create" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.302784 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d81f72af-8420-4334-811e-f0e0cc1c7731" containerName="mariadb-database-create" Feb 19 10:01:28 crc kubenswrapper[4873]: E0219 10:01:28.302793 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6410809-f775-4bf8-bc41-63f159854e76" containerName="ovn-config" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.302799 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6410809-f775-4bf8-bc41-63f159854e76" containerName="ovn-config" Feb 19 10:01:28 crc kubenswrapper[4873]: E0219 10:01:28.302807 4873 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88" containerName="mariadb-account-create-update" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.302814 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88" containerName="mariadb-account-create-update" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.302960 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad0c7de4-412a-4e8c-90d0-817151c8a015" containerName="mariadb-account-create-update" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.302975 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6410809-f775-4bf8-bc41-63f159854e76" containerName="ovn-config" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.302984 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88" containerName="mariadb-account-create-update" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.302993 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="d81f72af-8420-4334-811e-f0e0cc1c7731" containerName="mariadb-database-create" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.303523 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-r4fbt" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.315360 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-r4fbt"] Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.403616 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-e187-account-create-update-4xb7l"] Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.404553 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-e187-account-create-update-4xb7l" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.406789 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.421751 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e187-account-create-update-4xb7l"] Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.422164 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c73a-account-create-update-zxxrn" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.422229 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c73a-account-create-update-zxxrn" event={"ID":"fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88","Type":"ContainerDied","Data":"737a12bc43c5a907f8cb10150d1c5a92a48abbaf8d51eba39a18020cba56938f"} Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.422265 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="737a12bc43c5a907f8cb10150d1c5a92a48abbaf8d51eba39a18020cba56938f" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.427606 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-p55tt" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.427810 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p55tt" event={"ID":"d81f72af-8420-4334-811e-f0e0cc1c7731","Type":"ContainerDied","Data":"1dee66111cb00be8e1d7b10f7d4c7537cb2d5b855ce8c1d8f116d504503e7207"} Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.427864 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dee66111cb00be8e1d7b10f7d4c7537cb2d5b855ce8c1d8f116d504503e7207" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.429700 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vsnt5-config-mj2kr" event={"ID":"b6410809-f775-4bf8-bc41-63f159854e76","Type":"ContainerDied","Data":"0abe3568e1e754df8609a1e142601381c885a8173a04bd7acf2a44dd4c765ac5"} Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.429740 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0abe3568e1e754df8609a1e142601381c885a8173a04bd7acf2a44dd4c765ac5" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.429807 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vsnt5-config-mj2kr" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.436882 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d9nmp" event={"ID":"ad0c7de4-412a-4e8c-90d0-817151c8a015","Type":"ContainerDied","Data":"b1bc87ff69bfb1fa797a7dfd6ebfbe51b81cc642abb258a9974b216355561af2"} Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.436920 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1bc87ff69bfb1fa797a7dfd6ebfbe51b81cc642abb258a9974b216355561af2" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.436993 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-d9nmp" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.465349 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1770246-951b-40da-a0a2-4320dde71437-operator-scripts\") pod \"keystone-db-create-r4fbt\" (UID: \"a1770246-951b-40da-a0a2-4320dde71437\") " pod="openstack/keystone-db-create-r4fbt" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.465488 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgfcv\" (UniqueName: \"kubernetes.io/projected/a1770246-951b-40da-a0a2-4320dde71437-kube-api-access-xgfcv\") pod \"keystone-db-create-r4fbt\" (UID: \"a1770246-951b-40da-a0a2-4320dde71437\") " pod="openstack/keystone-db-create-r4fbt" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.497717 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-46kds"] Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.498854 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-46kds" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.506089 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-46kds"] Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.567020 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1770246-951b-40da-a0a2-4320dde71437-operator-scripts\") pod \"keystone-db-create-r4fbt\" (UID: \"a1770246-951b-40da-a0a2-4320dde71437\") " pod="openstack/keystone-db-create-r4fbt" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.567131 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4hpt\" (UniqueName: \"kubernetes.io/projected/af085fbb-9aaa-4d01-8a0f-a061acf3a845-kube-api-access-g4hpt\") pod \"keystone-e187-account-create-update-4xb7l\" (UID: \"af085fbb-9aaa-4d01-8a0f-a061acf3a845\") " pod="openstack/keystone-e187-account-create-update-4xb7l" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.567186 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgfcv\" (UniqueName: \"kubernetes.io/projected/a1770246-951b-40da-a0a2-4320dde71437-kube-api-access-xgfcv\") pod \"keystone-db-create-r4fbt\" (UID: \"a1770246-951b-40da-a0a2-4320dde71437\") " pod="openstack/keystone-db-create-r4fbt" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.567226 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af085fbb-9aaa-4d01-8a0f-a061acf3a845-operator-scripts\") pod \"keystone-e187-account-create-update-4xb7l\" (UID: \"af085fbb-9aaa-4d01-8a0f-a061acf3a845\") " pod="openstack/keystone-e187-account-create-update-4xb7l" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.568281 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1770246-951b-40da-a0a2-4320dde71437-operator-scripts\") pod \"keystone-db-create-r4fbt\" (UID: \"a1770246-951b-40da-a0a2-4320dde71437\") " pod="openstack/keystone-db-create-r4fbt" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.595529 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgfcv\" (UniqueName: \"kubernetes.io/projected/a1770246-951b-40da-a0a2-4320dde71437-kube-api-access-xgfcv\") pod \"keystone-db-create-r4fbt\" (UID: \"a1770246-951b-40da-a0a2-4320dde71437\") " pod="openstack/keystone-db-create-r4fbt" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.607689 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f064-account-create-update-flh2f"] Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.608728 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f064-account-create-update-flh2f" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.611674 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.621018 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f064-account-create-update-flh2f"] Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.624269 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-r4fbt" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.668765 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4hpt\" (UniqueName: \"kubernetes.io/projected/af085fbb-9aaa-4d01-8a0f-a061acf3a845-kube-api-access-g4hpt\") pod \"keystone-e187-account-create-update-4xb7l\" (UID: \"af085fbb-9aaa-4d01-8a0f-a061acf3a845\") " pod="openstack/keystone-e187-account-create-update-4xb7l" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.668815 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6584bab0-12c6-4bce-99be-d38f3748f896-operator-scripts\") pod \"placement-db-create-46kds\" (UID: \"6584bab0-12c6-4bce-99be-d38f3748f896\") " pod="openstack/placement-db-create-46kds" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.668883 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p9lk\" (UniqueName: \"kubernetes.io/projected/6584bab0-12c6-4bce-99be-d38f3748f896-kube-api-access-5p9lk\") pod \"placement-db-create-46kds\" (UID: \"6584bab0-12c6-4bce-99be-d38f3748f896\") " pod="openstack/placement-db-create-46kds" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.668916 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af085fbb-9aaa-4d01-8a0f-a061acf3a845-operator-scripts\") pod \"keystone-e187-account-create-update-4xb7l\" (UID: \"af085fbb-9aaa-4d01-8a0f-a061acf3a845\") " pod="openstack/keystone-e187-account-create-update-4xb7l" Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.669571 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af085fbb-9aaa-4d01-8a0f-a061acf3a845-operator-scripts\") pod 
\"keystone-e187-account-create-update-4xb7l\" (UID: \"af085fbb-9aaa-4d01-8a0f-a061acf3a845\") " pod="openstack/keystone-e187-account-create-update-4xb7l"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.686738 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4hpt\" (UniqueName: \"kubernetes.io/projected/af085fbb-9aaa-4d01-8a0f-a061acf3a845-kube-api-access-g4hpt\") pod \"keystone-e187-account-create-update-4xb7l\" (UID: \"af085fbb-9aaa-4d01-8a0f-a061acf3a845\") " pod="openstack/keystone-e187-account-create-update-4xb7l"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.723370 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e187-account-create-update-4xb7l"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.770753 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p9lk\" (UniqueName: \"kubernetes.io/projected/6584bab0-12c6-4bce-99be-d38f3748f896-kube-api-access-5p9lk\") pod \"placement-db-create-46kds\" (UID: \"6584bab0-12c6-4bce-99be-d38f3748f896\") " pod="openstack/placement-db-create-46kds"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.770836 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/179cf76d-a15d-4bce-be42-18ad2e4abb94-operator-scripts\") pod \"placement-f064-account-create-update-flh2f\" (UID: \"179cf76d-a15d-4bce-be42-18ad2e4abb94\") " pod="openstack/placement-f064-account-create-update-flh2f"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.770958 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2sgk\" (UniqueName: \"kubernetes.io/projected/179cf76d-a15d-4bce-be42-18ad2e4abb94-kube-api-access-p2sgk\") pod \"placement-f064-account-create-update-flh2f\" (UID: \"179cf76d-a15d-4bce-be42-18ad2e4abb94\") " pod="openstack/placement-f064-account-create-update-flh2f"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.770985 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6584bab0-12c6-4bce-99be-d38f3748f896-operator-scripts\") pod \"placement-db-create-46kds\" (UID: \"6584bab0-12c6-4bce-99be-d38f3748f896\") " pod="openstack/placement-db-create-46kds"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.771877 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6584bab0-12c6-4bce-99be-d38f3748f896-operator-scripts\") pod \"placement-db-create-46kds\" (UID: \"6584bab0-12c6-4bce-99be-d38f3748f896\") " pod="openstack/placement-db-create-46kds"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.796082 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p9lk\" (UniqueName: \"kubernetes.io/projected/6584bab0-12c6-4bce-99be-d38f3748f896-kube-api-access-5p9lk\") pod \"placement-db-create-46kds\" (UID: \"6584bab0-12c6-4bce-99be-d38f3748f896\") " pod="openstack/placement-db-create-46kds"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.818663 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-46kds"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.874995 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/179cf76d-a15d-4bce-be42-18ad2e4abb94-operator-scripts\") pod \"placement-f064-account-create-update-flh2f\" (UID: \"179cf76d-a15d-4bce-be42-18ad2e4abb94\") " pod="openstack/placement-f064-account-create-update-flh2f"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.875146 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2sgk\" (UniqueName: \"kubernetes.io/projected/179cf76d-a15d-4bce-be42-18ad2e4abb94-kube-api-access-p2sgk\") pod \"placement-f064-account-create-update-flh2f\" (UID: \"179cf76d-a15d-4bce-be42-18ad2e4abb94\") " pod="openstack/placement-f064-account-create-update-flh2f"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.875984 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/179cf76d-a15d-4bce-be42-18ad2e4abb94-operator-scripts\") pod \"placement-f064-account-create-update-flh2f\" (UID: \"179cf76d-a15d-4bce-be42-18ad2e4abb94\") " pod="openstack/placement-f064-account-create-update-flh2f"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.896765 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2sgk\" (UniqueName: \"kubernetes.io/projected/179cf76d-a15d-4bce-be42-18ad2e4abb94-kube-api-access-p2sgk\") pod \"placement-f064-account-create-update-flh2f\" (UID: \"179cf76d-a15d-4bce-be42-18ad2e4abb94\") " pod="openstack/placement-f064-account-create-update-flh2f"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.930763 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f064-account-create-update-flh2f"
Feb 19 10:01:28 crc kubenswrapper[4873]: I0219 10:01:28.948340 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lmkgp" podUID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerName="registry-server" probeResult="failure" output=<
Feb 19 10:01:28 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s
Feb 19 10:01:28 crc kubenswrapper[4873]: >
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.192351 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-r4fbt"]
Feb 19 10:01:29 crc kubenswrapper[4873]: W0219 10:01:29.199073 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1770246_951b_40da_a0a2_4320dde71437.slice/crio-c00ae3a8c5b91d1638f081e9179c87fa9241c86e2f6597e4bda03d4a8763efb2 WatchSource:0}: Error finding container c00ae3a8c5b91d1638f081e9179c87fa9241c86e2f6597e4bda03d4a8763efb2: Status 404 returned error can't find the container with id c00ae3a8c5b91d1638f081e9179c87fa9241c86e2f6597e4bda03d4a8763efb2
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.320140 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e187-account-create-update-4xb7l"]
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.348948 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vsnt5-config-mj2kr"]
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.363326 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vsnt5-config-mj2kr"]
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.466954 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-r4fbt" event={"ID":"a1770246-951b-40da-a0a2-4320dde71437","Type":"ContainerStarted","Data":"c00ae3a8c5b91d1638f081e9179c87fa9241c86e2f6597e4bda03d4a8763efb2"}
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.481916 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e187-account-create-update-4xb7l" event={"ID":"af085fbb-9aaa-4d01-8a0f-a061acf3a845","Type":"ContainerStarted","Data":"182e7c202e54319407360794cf227f99f4d9087f026461b7d72de7a24ada750c"}
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.482876 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-46kds"]
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.514114 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6410809-f775-4bf8-bc41-63f159854e76" path="/var/lib/kubelet/pods/b6410809-f775-4bf8-bc41-63f159854e76/volumes"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.517495 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-46kds" event={"ID":"6584bab0-12c6-4bce-99be-d38f3748f896","Type":"ContainerStarted","Data":"8c9bc379e5f12ad6cc9fc4a894a6c0e8ca43c8a837f75aefd0d3827ab77c8833"}
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.537494 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f064-account-create-update-flh2f"]
Feb 19 10:01:29 crc kubenswrapper[4873]: W0219 10:01:29.553076 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod179cf76d_a15d_4bce_be42_18ad2e4abb94.slice/crio-32cc1e04ecdfcb01e465840967113b749d5b3dd117345a4527603f742b0f0727 WatchSource:0}: Error finding container 32cc1e04ecdfcb01e465840967113b749d5b3dd117345a4527603f742b0f0727: Status 404 returned error can't find the container with id 32cc1e04ecdfcb01e465840967113b749d5b3dd117345a4527603f742b0f0727
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.620089 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-dftzh"]
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.621829 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-dftzh"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.630557 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-dftzh"]
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.718086 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-087c-account-create-update-qnlsx"]
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.719156 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-087c-account-create-update-qnlsx"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.721483 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.729811 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-087c-account-create-update-qnlsx"]
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.799510 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b0cc2ef-89a2-4220-8b44-7fc71537ab50-operator-scripts\") pod \"watcher-087c-account-create-update-qnlsx\" (UID: \"7b0cc2ef-89a2-4220-8b44-7fc71537ab50\") " pod="openstack/watcher-087c-account-create-update-qnlsx"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.799681 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2hmj\" (UniqueName: \"kubernetes.io/projected/bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4-kube-api-access-q2hmj\") pod \"watcher-db-create-dftzh\" (UID: \"bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4\") " pod="openstack/watcher-db-create-dftzh"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.799878 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4-operator-scripts\") pod \"watcher-db-create-dftzh\" (UID: \"bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4\") " pod="openstack/watcher-db-create-dftzh"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.799993 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsvsx\" (UniqueName: \"kubernetes.io/projected/7b0cc2ef-89a2-4220-8b44-7fc71537ab50-kube-api-access-nsvsx\") pod \"watcher-087c-account-create-update-qnlsx\" (UID: \"7b0cc2ef-89a2-4220-8b44-7fc71537ab50\") " pod="openstack/watcher-087c-account-create-update-qnlsx"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.901554 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4-operator-scripts\") pod \"watcher-db-create-dftzh\" (UID: \"bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4\") " pod="openstack/watcher-db-create-dftzh"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.901651 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsvsx\" (UniqueName: \"kubernetes.io/projected/7b0cc2ef-89a2-4220-8b44-7fc71537ab50-kube-api-access-nsvsx\") pod \"watcher-087c-account-create-update-qnlsx\" (UID: \"7b0cc2ef-89a2-4220-8b44-7fc71537ab50\") " pod="openstack/watcher-087c-account-create-update-qnlsx"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.901741 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b0cc2ef-89a2-4220-8b44-7fc71537ab50-operator-scripts\") pod \"watcher-087c-account-create-update-qnlsx\" (UID: \"7b0cc2ef-89a2-4220-8b44-7fc71537ab50\") " pod="openstack/watcher-087c-account-create-update-qnlsx"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.901794 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2hmj\" (UniqueName: \"kubernetes.io/projected/bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4-kube-api-access-q2hmj\") pod \"watcher-db-create-dftzh\" (UID: \"bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4\") " pod="openstack/watcher-db-create-dftzh"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.902679 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4-operator-scripts\") pod \"watcher-db-create-dftzh\" (UID: \"bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4\") " pod="openstack/watcher-db-create-dftzh"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.902803 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b0cc2ef-89a2-4220-8b44-7fc71537ab50-operator-scripts\") pod \"watcher-087c-account-create-update-qnlsx\" (UID: \"7b0cc2ef-89a2-4220-8b44-7fc71537ab50\") " pod="openstack/watcher-087c-account-create-update-qnlsx"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.924165 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsvsx\" (UniqueName: \"kubernetes.io/projected/7b0cc2ef-89a2-4220-8b44-7fc71537ab50-kube-api-access-nsvsx\") pod \"watcher-087c-account-create-update-qnlsx\" (UID: \"7b0cc2ef-89a2-4220-8b44-7fc71537ab50\") " pod="openstack/watcher-087c-account-create-update-qnlsx"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.924443 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2hmj\" (UniqueName: \"kubernetes.io/projected/bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4-kube-api-access-q2hmj\") pod \"watcher-db-create-dftzh\" (UID: \"bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4\") " pod="openstack/watcher-db-create-dftzh"
Feb 19 10:01:29 crc kubenswrapper[4873]: I0219 10:01:29.954562 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-dftzh"
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.075256 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-087c-account-create-update-qnlsx"
Feb 19 10:01:30 crc kubenswrapper[4873]: W0219 10:01:30.460855 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd8b7b2e_f4a8_4af9_99aa_a1e8c3d78bd4.slice/crio-3eb30dc7ce6a38d18c43fa6669a1fe45b71d1e88c10fa39a798d51dbbdc8535a WatchSource:0}: Error finding container 3eb30dc7ce6a38d18c43fa6669a1fe45b71d1e88c10fa39a798d51dbbdc8535a: Status 404 returned error can't find the container with id 3eb30dc7ce6a38d18c43fa6669a1fe45b71d1e88c10fa39a798d51dbbdc8535a
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.479021 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-dftzh"]
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.509416 4873 generic.go:334] "Generic (PLEG): container finished" podID="6584bab0-12c6-4bce-99be-d38f3748f896" containerID="2be1eaacedf333b387e3ffd6dce5223b73f9487c48808cd68df4b60a3f55fd39" exitCode=0
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.509498 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-46kds" event={"ID":"6584bab0-12c6-4bce-99be-d38f3748f896","Type":"ContainerDied","Data":"2be1eaacedf333b387e3ffd6dce5223b73f9487c48808cd68df4b60a3f55fd39"}
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.511192 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-dftzh" event={"ID":"bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4","Type":"ContainerStarted","Data":"3eb30dc7ce6a38d18c43fa6669a1fe45b71d1e88c10fa39a798d51dbbdc8535a"}
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.513139 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-r4fbt" event={"ID":"a1770246-951b-40da-a0a2-4320dde71437","Type":"ContainerDied","Data":"3a7ee324cc97736a2be2ff10cda880e991b9ebce5c06108335c9156379f7a8ea"}
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.513211 4873 generic.go:334] "Generic (PLEG): container finished" podID="a1770246-951b-40da-a0a2-4320dde71437" containerID="3a7ee324cc97736a2be2ff10cda880e991b9ebce5c06108335c9156379f7a8ea" exitCode=0
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.518946 4873 generic.go:334] "Generic (PLEG): container finished" podID="179cf76d-a15d-4bce-be42-18ad2e4abb94" containerID="7d6d1faa851ee46aca753c0c6509416782269dc982725ceddb2cd7f19fc16f13" exitCode=0
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.519025 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f064-account-create-update-flh2f" event={"ID":"179cf76d-a15d-4bce-be42-18ad2e4abb94","Type":"ContainerDied","Data":"7d6d1faa851ee46aca753c0c6509416782269dc982725ceddb2cd7f19fc16f13"}
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.519056 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f064-account-create-update-flh2f" event={"ID":"179cf76d-a15d-4bce-be42-18ad2e4abb94","Type":"ContainerStarted","Data":"32cc1e04ecdfcb01e465840967113b749d5b3dd117345a4527603f742b0f0727"}
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.522143 4873 generic.go:334] "Generic (PLEG): container finished" podID="af085fbb-9aaa-4d01-8a0f-a061acf3a845" containerID="d63b34383441b0e539673e31cf4ea017f3d4fcdbd72ad26d47bf96c33fcf565d" exitCode=0
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.522210 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e187-account-create-update-4xb7l" event={"ID":"af085fbb-9aaa-4d01-8a0f-a061acf3a845","Type":"ContainerDied","Data":"d63b34383441b0e539673e31cf4ea017f3d4fcdbd72ad26d47bf96c33fcf565d"}
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.565577 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-087c-account-create-update-qnlsx"]
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.812800 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-d9nmp"]
Feb 19 10:01:30 crc kubenswrapper[4873]: I0219 10:01:30.820160 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-d9nmp"]
Feb 19 10:01:31 crc kubenswrapper[4873]: I0219 10:01:31.502070 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad0c7de4-412a-4e8c-90d0-817151c8a015" path="/var/lib/kubelet/pods/ad0c7de4-412a-4e8c-90d0-817151c8a015/volumes"
Feb 19 10:01:31 crc kubenswrapper[4873]: I0219 10:01:31.532266 4873 generic.go:334] "Generic (PLEG): container finished" podID="7b0cc2ef-89a2-4220-8b44-7fc71537ab50" containerID="f0d3f3ad8d69a092fbacd08190bbe079ce8644eec25f2003bbee9cc3d511dd9c" exitCode=0
Feb 19 10:01:31 crc kubenswrapper[4873]: I0219 10:01:31.532350 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-087c-account-create-update-qnlsx" event={"ID":"7b0cc2ef-89a2-4220-8b44-7fc71537ab50","Type":"ContainerDied","Data":"f0d3f3ad8d69a092fbacd08190bbe079ce8644eec25f2003bbee9cc3d511dd9c"}
Feb 19 10:01:31 crc kubenswrapper[4873]: I0219 10:01:31.532402 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-087c-account-create-update-qnlsx" event={"ID":"7b0cc2ef-89a2-4220-8b44-7fc71537ab50","Type":"ContainerStarted","Data":"3721d608d71f840cfefd584322332fd88b72dcb989cdcfcd288c6da01a5126d0"}
Feb 19 10:01:31 crc kubenswrapper[4873]: I0219 10:01:31.534167 4873 generic.go:334] "Generic (PLEG): container finished" podID="bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4" containerID="6a370733fa679d2517624889ae788a6c37c512bf2894dbe6a54f6e24bdad6056" exitCode=0
Feb 19 10:01:31 crc kubenswrapper[4873]: I0219 10:01:31.534214 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-dftzh" event={"ID":"bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4","Type":"ContainerDied","Data":"6a370733fa679d2517624889ae788a6c37c512bf2894dbe6a54f6e24bdad6056"}
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.105826 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f064-account-create-update-flh2f"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.112060 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-46kds"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.117534 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-r4fbt"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.125384 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e187-account-create-update-4xb7l"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.263393 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/179cf76d-a15d-4bce-be42-18ad2e4abb94-operator-scripts\") pod \"179cf76d-a15d-4bce-be42-18ad2e4abb94\" (UID: \"179cf76d-a15d-4bce-be42-18ad2e4abb94\") "
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.263447 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2sgk\" (UniqueName: \"kubernetes.io/projected/179cf76d-a15d-4bce-be42-18ad2e4abb94-kube-api-access-p2sgk\") pod \"179cf76d-a15d-4bce-be42-18ad2e4abb94\" (UID: \"179cf76d-a15d-4bce-be42-18ad2e4abb94\") "
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.263511 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4hpt\" (UniqueName: \"kubernetes.io/projected/af085fbb-9aaa-4d01-8a0f-a061acf3a845-kube-api-access-g4hpt\") pod \"af085fbb-9aaa-4d01-8a0f-a061acf3a845\" (UID: \"af085fbb-9aaa-4d01-8a0f-a061acf3a845\") "
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.263553 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgfcv\" (UniqueName: \"kubernetes.io/projected/a1770246-951b-40da-a0a2-4320dde71437-kube-api-access-xgfcv\") pod \"a1770246-951b-40da-a0a2-4320dde71437\" (UID: \"a1770246-951b-40da-a0a2-4320dde71437\") "
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.263579 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6584bab0-12c6-4bce-99be-d38f3748f896-operator-scripts\") pod \"6584bab0-12c6-4bce-99be-d38f3748f896\" (UID: \"6584bab0-12c6-4bce-99be-d38f3748f896\") "
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.263648 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af085fbb-9aaa-4d01-8a0f-a061acf3a845-operator-scripts\") pod \"af085fbb-9aaa-4d01-8a0f-a061acf3a845\" (UID: \"af085fbb-9aaa-4d01-8a0f-a061acf3a845\") "
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.263675 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1770246-951b-40da-a0a2-4320dde71437-operator-scripts\") pod \"a1770246-951b-40da-a0a2-4320dde71437\" (UID: \"a1770246-951b-40da-a0a2-4320dde71437\") "
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.263733 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p9lk\" (UniqueName: \"kubernetes.io/projected/6584bab0-12c6-4bce-99be-d38f3748f896-kube-api-access-5p9lk\") pod \"6584bab0-12c6-4bce-99be-d38f3748f896\" (UID: \"6584bab0-12c6-4bce-99be-d38f3748f896\") "
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.264331 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/179cf76d-a15d-4bce-be42-18ad2e4abb94-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "179cf76d-a15d-4bce-be42-18ad2e4abb94" (UID: "179cf76d-a15d-4bce-be42-18ad2e4abb94"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.264516 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af085fbb-9aaa-4d01-8a0f-a061acf3a845-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af085fbb-9aaa-4d01-8a0f-a061acf3a845" (UID: "af085fbb-9aaa-4d01-8a0f-a061acf3a845"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.264540 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1770246-951b-40da-a0a2-4320dde71437-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a1770246-951b-40da-a0a2-4320dde71437" (UID: "a1770246-951b-40da-a0a2-4320dde71437"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.264873 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6584bab0-12c6-4bce-99be-d38f3748f896-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6584bab0-12c6-4bce-99be-d38f3748f896" (UID: "6584bab0-12c6-4bce-99be-d38f3748f896"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.270005 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6584bab0-12c6-4bce-99be-d38f3748f896-kube-api-access-5p9lk" (OuterVolumeSpecName: "kube-api-access-5p9lk") pod "6584bab0-12c6-4bce-99be-d38f3748f896" (UID: "6584bab0-12c6-4bce-99be-d38f3748f896"). InnerVolumeSpecName "kube-api-access-5p9lk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.270434 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1770246-951b-40da-a0a2-4320dde71437-kube-api-access-xgfcv" (OuterVolumeSpecName: "kube-api-access-xgfcv") pod "a1770246-951b-40da-a0a2-4320dde71437" (UID: "a1770246-951b-40da-a0a2-4320dde71437"). InnerVolumeSpecName "kube-api-access-xgfcv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.272340 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af085fbb-9aaa-4d01-8a0f-a061acf3a845-kube-api-access-g4hpt" (OuterVolumeSpecName: "kube-api-access-g4hpt") pod "af085fbb-9aaa-4d01-8a0f-a061acf3a845" (UID: "af085fbb-9aaa-4d01-8a0f-a061acf3a845"). InnerVolumeSpecName "kube-api-access-g4hpt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.276362 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/179cf76d-a15d-4bce-be42-18ad2e4abb94-kube-api-access-p2sgk" (OuterVolumeSpecName: "kube-api-access-p2sgk") pod "179cf76d-a15d-4bce-be42-18ad2e4abb94" (UID: "179cf76d-a15d-4bce-be42-18ad2e4abb94"). InnerVolumeSpecName "kube-api-access-p2sgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.365807 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/179cf76d-a15d-4bce-be42-18ad2e4abb94-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.365850 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2sgk\" (UniqueName: \"kubernetes.io/projected/179cf76d-a15d-4bce-be42-18ad2e4abb94-kube-api-access-p2sgk\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.365864 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4hpt\" (UniqueName: \"kubernetes.io/projected/af085fbb-9aaa-4d01-8a0f-a061acf3a845-kube-api-access-g4hpt\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.365872 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgfcv\" (UniqueName: \"kubernetes.io/projected/a1770246-951b-40da-a0a2-4320dde71437-kube-api-access-xgfcv\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.365883 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6584bab0-12c6-4bce-99be-d38f3748f896-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.365894 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af085fbb-9aaa-4d01-8a0f-a061acf3a845-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.365904 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1770246-951b-40da-a0a2-4320dde71437-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.365915 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p9lk\" (UniqueName: \"kubernetes.io/projected/6584bab0-12c6-4bce-99be-d38f3748f896-kube-api-access-5p9lk\") on node \"crc\" DevicePath \"\""
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.544062 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-r4fbt"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.544112 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-r4fbt" event={"ID":"a1770246-951b-40da-a0a2-4320dde71437","Type":"ContainerDied","Data":"c00ae3a8c5b91d1638f081e9179c87fa9241c86e2f6597e4bda03d4a8763efb2"}
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.544153 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c00ae3a8c5b91d1638f081e9179c87fa9241c86e2f6597e4bda03d4a8763efb2"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.546206 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f064-account-create-update-flh2f"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.546202 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f064-account-create-update-flh2f" event={"ID":"179cf76d-a15d-4bce-be42-18ad2e4abb94","Type":"ContainerDied","Data":"32cc1e04ecdfcb01e465840967113b749d5b3dd117345a4527603f742b0f0727"}
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.546388 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32cc1e04ecdfcb01e465840967113b749d5b3dd117345a4527603f742b0f0727"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.548601 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e187-account-create-update-4xb7l" event={"ID":"af085fbb-9aaa-4d01-8a0f-a061acf3a845","Type":"ContainerDied","Data":"182e7c202e54319407360794cf227f99f4d9087f026461b7d72de7a24ada750c"}
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.548633 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="182e7c202e54319407360794cf227f99f4d9087f026461b7d72de7a24ada750c"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.548634 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e187-account-create-update-4xb7l"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.551000 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-46kds" event={"ID":"6584bab0-12c6-4bce-99be-d38f3748f896","Type":"ContainerDied","Data":"8c9bc379e5f12ad6cc9fc4a894a6c0e8ca43c8a837f75aefd0d3827ab77c8833"}
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.551056 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c9bc379e5f12ad6cc9fc4a894a6c0e8ca43c8a837f75aefd0d3827ab77c8833"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.551118 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-46kds"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.675022 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.683361 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c3b21a02-7162-42ca-84cf-e0fa36b04a22-etc-swift\") pod \"swift-storage-0\" (UID: \"c3b21a02-7162-42ca-84cf-e0fa36b04a22\") " pod="openstack/swift-storage-0"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.689701 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-9472r"]
Feb 19 10:01:32 crc kubenswrapper[4873]: E0219 10:01:32.690590 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179cf76d-a15d-4bce-be42-18ad2e4abb94" containerName="mariadb-account-create-update"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.690614 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="179cf76d-a15d-4bce-be42-18ad2e4abb94" containerName="mariadb-account-create-update"
Feb 19 10:01:32 crc kubenswrapper[4873]: E0219 10:01:32.690646 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af085fbb-9aaa-4d01-8a0f-a061acf3a845" containerName="mariadb-account-create-update"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.690656 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="af085fbb-9aaa-4d01-8a0f-a061acf3a845" containerName="mariadb-account-create-update"
Feb 19 10:01:32 crc kubenswrapper[4873]: E0219 10:01:32.690673 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6584bab0-12c6-4bce-99be-d38f3748f896" containerName="mariadb-database-create"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.690678 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="6584bab0-12c6-4bce-99be-d38f3748f896" containerName="mariadb-database-create"
Feb 19 10:01:32 crc kubenswrapper[4873]: E0219 10:01:32.690688 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1770246-951b-40da-a0a2-4320dde71437" containerName="mariadb-database-create"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.690694 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1770246-951b-40da-a0a2-4320dde71437" containerName="mariadb-database-create"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.690886 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="af085fbb-9aaa-4d01-8a0f-a061acf3a845" containerName="mariadb-account-create-update"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.690902 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1770246-951b-40da-a0a2-4320dde71437" containerName="mariadb-database-create"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.690910 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="179cf76d-a15d-4bce-be42-18ad2e4abb94" containerName="mariadb-account-create-update"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.690920 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="6584bab0-12c6-4bce-99be-d38f3748f896" containerName="mariadb-database-create"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.691518 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9472r"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.694790 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-n9qxt"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.694805 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.711656 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9472r"]
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.878144 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-config-data\") pod \"glance-db-sync-9472r\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " pod="openstack/glance-db-sync-9472r"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.879596 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-combined-ca-bundle\") pod \"glance-db-sync-9472r\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " pod="openstack/glance-db-sync-9472r"
Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.879689 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-db-sync-config-data\") pod \"glance-db-sync-9472r\" (UID:
\"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " pod="openstack/glance-db-sync-9472r" Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.879750 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btfz7\" (UniqueName: \"kubernetes.io/projected/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-kube-api-access-btfz7\") pod \"glance-db-sync-9472r\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " pod="openstack/glance-db-sync-9472r" Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.896462 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.981071 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-config-data\") pod \"glance-db-sync-9472r\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " pod="openstack/glance-db-sync-9472r" Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.981236 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-combined-ca-bundle\") pod \"glance-db-sync-9472r\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " pod="openstack/glance-db-sync-9472r" Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.981268 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-db-sync-config-data\") pod \"glance-db-sync-9472r\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " pod="openstack/glance-db-sync-9472r" Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.981292 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btfz7\" (UniqueName: 
\"kubernetes.io/projected/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-kube-api-access-btfz7\") pod \"glance-db-sync-9472r\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " pod="openstack/glance-db-sync-9472r" Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.987477 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-config-data\") pod \"glance-db-sync-9472r\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " pod="openstack/glance-db-sync-9472r" Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.987802 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-db-sync-config-data\") pod \"glance-db-sync-9472r\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " pod="openstack/glance-db-sync-9472r" Feb 19 10:01:32 crc kubenswrapper[4873]: I0219 10:01:32.996022 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-combined-ca-bundle\") pod \"glance-db-sync-9472r\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " pod="openstack/glance-db-sync-9472r" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.015584 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btfz7\" (UniqueName: \"kubernetes.io/projected/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-kube-api-access-btfz7\") pod \"glance-db-sync-9472r\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " pod="openstack/glance-db-sync-9472r" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.163602 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-dftzh" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.190838 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-087c-account-create-update-qnlsx" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.284567 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4-operator-scripts\") pod \"bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4\" (UID: \"bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4\") " Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.284904 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2hmj\" (UniqueName: \"kubernetes.io/projected/bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4-kube-api-access-q2hmj\") pod \"bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4\" (UID: \"bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4\") " Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.285746 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4" (UID: "bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.290159 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4-kube-api-access-q2hmj" (OuterVolumeSpecName: "kube-api-access-q2hmj") pod "bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4" (UID: "bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4"). InnerVolumeSpecName "kube-api-access-q2hmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.312835 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9472r" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.389513 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsvsx\" (UniqueName: \"kubernetes.io/projected/7b0cc2ef-89a2-4220-8b44-7fc71537ab50-kube-api-access-nsvsx\") pod \"7b0cc2ef-89a2-4220-8b44-7fc71537ab50\" (UID: \"7b0cc2ef-89a2-4220-8b44-7fc71537ab50\") " Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.389583 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b0cc2ef-89a2-4220-8b44-7fc71537ab50-operator-scripts\") pod \"7b0cc2ef-89a2-4220-8b44-7fc71537ab50\" (UID: \"7b0cc2ef-89a2-4220-8b44-7fc71537ab50\") " Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.389823 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.389835 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2hmj\" (UniqueName: \"kubernetes.io/projected/bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4-kube-api-access-q2hmj\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.390120 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b0cc2ef-89a2-4220-8b44-7fc71537ab50-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b0cc2ef-89a2-4220-8b44-7fc71537ab50" (UID: "7b0cc2ef-89a2-4220-8b44-7fc71537ab50"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.395487 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0cc2ef-89a2-4220-8b44-7fc71537ab50-kube-api-access-nsvsx" (OuterVolumeSpecName: "kube-api-access-nsvsx") pod "7b0cc2ef-89a2-4220-8b44-7fc71537ab50" (UID: "7b0cc2ef-89a2-4220-8b44-7fc71537ab50"). InnerVolumeSpecName "kube-api-access-nsvsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.493435 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b0cc2ef-89a2-4220-8b44-7fc71537ab50-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.493907 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsvsx\" (UniqueName: \"kubernetes.io/projected/7b0cc2ef-89a2-4220-8b44-7fc71537ab50-kube-api-access-nsvsx\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.569426 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-087c-account-create-update-qnlsx" event={"ID":"7b0cc2ef-89a2-4220-8b44-7fc71537ab50","Type":"ContainerDied","Data":"3721d608d71f840cfefd584322332fd88b72dcb989cdcfcd288c6da01a5126d0"} Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.569481 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3721d608d71f840cfefd584322332fd88b72dcb989cdcfcd288c6da01a5126d0" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.569581 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-087c-account-create-update-qnlsx" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.582891 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-dftzh" event={"ID":"bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4","Type":"ContainerDied","Data":"3eb30dc7ce6a38d18c43fa6669a1fe45b71d1e88c10fa39a798d51dbbdc8535a"} Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.582937 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eb30dc7ce6a38d18c43fa6669a1fe45b71d1e88c10fa39a798d51dbbdc8535a" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.583024 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-dftzh" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.636750 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/notifications-rabbitmq-server-0" podUID="da89f0ff-c51c-4c4a-8df4-f7787d29ddd2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.750955 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 19 10:01:33 crc kubenswrapper[4873]: I0219 10:01:33.951293 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9472r"] Feb 19 10:01:33 crc kubenswrapper[4873]: W0219 10:01:33.964535 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ce0a8b9_b7a8_4ee7_8d68_0e6145ada6ba.slice/crio-e77590cdf691d40582aa326fa0b9971a60cdea85c0908db700bc4cf93f06c74b WatchSource:0}: Error finding container e77590cdf691d40582aa326fa0b9971a60cdea85c0908db700bc4cf93f06c74b: Status 404 returned error can't find the container with id e77590cdf691d40582aa326fa0b9971a60cdea85c0908db700bc4cf93f06c74b Feb 19 10:01:33 crc 
kubenswrapper[4873]: I0219 10:01:33.978715 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="9251ac9a-275e-4622-83a2-121d59ec8cd1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Feb 19 10:01:34 crc kubenswrapper[4873]: I0219 10:01:34.326816 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="86685946-19ac-434a-974f-99b5beeda172" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Feb 19 10:01:34 crc kubenswrapper[4873]: I0219 10:01:34.596981 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9472r" event={"ID":"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba","Type":"ContainerStarted","Data":"e77590cdf691d40582aa326fa0b9971a60cdea85c0908db700bc4cf93f06c74b"} Feb 19 10:01:34 crc kubenswrapper[4873]: I0219 10:01:34.600641 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"38196c7be30e9f61335fa1f0192f051e1628db316ccf2a1bfb000ba2cf14f1d1"} Feb 19 10:01:34 crc kubenswrapper[4873]: I0219 10:01:34.600675 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"f31977bf19b6ff3e5baded4225c4014777cdf792cf2977b06fd51396f8c73011"} Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.619669 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"c42f18c86fb159a11eb5d3fb83148d0afcb1fa64baa24bda8073ec6b339ef356"} Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.620000 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"00a173a5adc7f8b49c8265c3cd1d39f7fab154d134782f52f029ed762c47ec91"} Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.620011 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"7662dd8178d787f4ea3d1a4ae3858a1da893475dae8593dd22bb93a83b9bc95b"} Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.831310 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-kvmj2"] Feb 19 10:01:35 crc kubenswrapper[4873]: E0219 10:01:35.831613 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0cc2ef-89a2-4220-8b44-7fc71537ab50" containerName="mariadb-account-create-update" Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.831625 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0cc2ef-89a2-4220-8b44-7fc71537ab50" containerName="mariadb-account-create-update" Feb 19 10:01:35 crc kubenswrapper[4873]: E0219 10:01:35.831653 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4" containerName="mariadb-database-create" Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.831658 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4" containerName="mariadb-database-create" Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.831817 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4" containerName="mariadb-database-create" Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.831836 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b0cc2ef-89a2-4220-8b44-7fc71537ab50" containerName="mariadb-account-create-update" Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.832365 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kvmj2" Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.838769 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29a6680f-7e8e-4326-9401-fde957599477-operator-scripts\") pod \"root-account-create-update-kvmj2\" (UID: \"29a6680f-7e8e-4326-9401-fde957599477\") " pod="openstack/root-account-create-update-kvmj2" Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.839142 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gbw2\" (UniqueName: \"kubernetes.io/projected/29a6680f-7e8e-4326-9401-fde957599477-kube-api-access-4gbw2\") pod \"root-account-create-update-kvmj2\" (UID: \"29a6680f-7e8e-4326-9401-fde957599477\") " pod="openstack/root-account-create-update-kvmj2" Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.839949 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.841055 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kvmj2"] Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.865636 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.871206 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.953414 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gbw2\" (UniqueName: \"kubernetes.io/projected/29a6680f-7e8e-4326-9401-fde957599477-kube-api-access-4gbw2\") pod \"root-account-create-update-kvmj2\" (UID: \"29a6680f-7e8e-4326-9401-fde957599477\") 
" pod="openstack/root-account-create-update-kvmj2" Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.953664 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29a6680f-7e8e-4326-9401-fde957599477-operator-scripts\") pod \"root-account-create-update-kvmj2\" (UID: \"29a6680f-7e8e-4326-9401-fde957599477\") " pod="openstack/root-account-create-update-kvmj2" Feb 19 10:01:35 crc kubenswrapper[4873]: I0219 10:01:35.954907 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29a6680f-7e8e-4326-9401-fde957599477-operator-scripts\") pod \"root-account-create-update-kvmj2\" (UID: \"29a6680f-7e8e-4326-9401-fde957599477\") " pod="openstack/root-account-create-update-kvmj2" Feb 19 10:01:36 crc kubenswrapper[4873]: I0219 10:01:36.002319 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gbw2\" (UniqueName: \"kubernetes.io/projected/29a6680f-7e8e-4326-9401-fde957599477-kube-api-access-4gbw2\") pod \"root-account-create-update-kvmj2\" (UID: \"29a6680f-7e8e-4326-9401-fde957599477\") " pod="openstack/root-account-create-update-kvmj2" Feb 19 10:01:36 crc kubenswrapper[4873]: I0219 10:01:36.150222 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kvmj2" Feb 19 10:01:36 crc kubenswrapper[4873]: I0219 10:01:36.638495 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:36 crc kubenswrapper[4873]: I0219 10:01:36.899273 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kvmj2"] Feb 19 10:01:36 crc kubenswrapper[4873]: W0219 10:01:36.913297 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29a6680f_7e8e_4326_9401_fde957599477.slice/crio-f3d693699de1f87f0e2baf8cbe839d21813317244d1f8a537b02c8e0a41220d6 WatchSource:0}: Error finding container f3d693699de1f87f0e2baf8cbe839d21813317244d1f8a537b02c8e0a41220d6: Status 404 returned error can't find the container with id f3d693699de1f87f0e2baf8cbe839d21813317244d1f8a537b02c8e0a41220d6 Feb 19 10:01:37 crc kubenswrapper[4873]: I0219 10:01:37.647090 4873 generic.go:334] "Generic (PLEG): container finished" podID="29a6680f-7e8e-4326-9401-fde957599477" containerID="97c6c0035f5f6c9762dd68933f3909de6f99dfa1fe212cf2c55b0644dfffdb93" exitCode=0 Feb 19 10:01:37 crc kubenswrapper[4873]: I0219 10:01:37.647306 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kvmj2" event={"ID":"29a6680f-7e8e-4326-9401-fde957599477","Type":"ContainerDied","Data":"97c6c0035f5f6c9762dd68933f3909de6f99dfa1fe212cf2c55b0644dfffdb93"} Feb 19 10:01:37 crc kubenswrapper[4873]: I0219 10:01:37.647549 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kvmj2" event={"ID":"29a6680f-7e8e-4326-9401-fde957599477","Type":"ContainerStarted","Data":"f3d693699de1f87f0e2baf8cbe839d21813317244d1f8a537b02c8e0a41220d6"} Feb 19 10:01:37 crc kubenswrapper[4873]: I0219 10:01:37.662963 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"d0826b7651dda0e678af49c63026f36047f31ce626d2571624180f0bdc91d047"} Feb 19 10:01:37 crc kubenswrapper[4873]: I0219 10:01:37.663011 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"6551ae035c3222119a9866389eed635be426d6bf76298476c494a0579571f69e"} Feb 19 10:01:37 crc kubenswrapper[4873]: I0219 10:01:37.663041 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"105e37c7b1a18d3d62b5ba80f1c54dbbad88863bd679c19a92a9d87a6935d1d9"} Feb 19 10:01:37 crc kubenswrapper[4873]: I0219 10:01:37.663053 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"5b1d1cf50f8beb3ecbd3ab03983d6ba7a17f20f00ccee08ee1f87637cf0af12b"} Feb 19 10:01:38 crc kubenswrapper[4873]: I0219 10:01:38.963778 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lmkgp" podUID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerName="registry-server" probeResult="failure" output=< Feb 19 10:01:38 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s Feb 19 10:01:38 crc kubenswrapper[4873]: > Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.371372 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kvmj2" Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.517844 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gbw2\" (UniqueName: \"kubernetes.io/projected/29a6680f-7e8e-4326-9401-fde957599477-kube-api-access-4gbw2\") pod \"29a6680f-7e8e-4326-9401-fde957599477\" (UID: \"29a6680f-7e8e-4326-9401-fde957599477\") " Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.518331 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29a6680f-7e8e-4326-9401-fde957599477-operator-scripts\") pod \"29a6680f-7e8e-4326-9401-fde957599477\" (UID: \"29a6680f-7e8e-4326-9401-fde957599477\") " Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.519576 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a6680f-7e8e-4326-9401-fde957599477-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29a6680f-7e8e-4326-9401-fde957599477" (UID: "29a6680f-7e8e-4326-9401-fde957599477"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.525113 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a6680f-7e8e-4326-9401-fde957599477-kube-api-access-4gbw2" (OuterVolumeSpecName: "kube-api-access-4gbw2") pod "29a6680f-7e8e-4326-9401-fde957599477" (UID: "29a6680f-7e8e-4326-9401-fde957599477"). InnerVolumeSpecName "kube-api-access-4gbw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.620368 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29a6680f-7e8e-4326-9401-fde957599477-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.620401 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gbw2\" (UniqueName: \"kubernetes.io/projected/29a6680f-7e8e-4326-9401-fde957599477-kube-api-access-4gbw2\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.691487 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"43241f4e3031b74ccbe2321325723ab3e532db7807ed1e957e9fd210bc9f3828"} Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.691530 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"453e54863e0fc731cb1decd734e78f4081a88821601c764cfce9b82cbbf25eda"} Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.693788 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kvmj2" event={"ID":"29a6680f-7e8e-4326-9401-fde957599477","Type":"ContainerDied","Data":"f3d693699de1f87f0e2baf8cbe839d21813317244d1f8a537b02c8e0a41220d6"} Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.693817 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3d693699de1f87f0e2baf8cbe839d21813317244d1f8a537b02c8e0a41220d6" Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.693865 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kvmj2" Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.724749 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.725167 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="prometheus" containerID="cri-o://8c1b23b8ffccae7306c11a4c3d6415747d09b3586d46bdcc68e728a0936c32a0" gracePeriod=600 Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.725255 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="config-reloader" containerID="cri-o://25a26e338e379f85c4c91e347569e9af0c97d68170a522acb020aee8d309e23c" gracePeriod=600 Feb 19 10:01:39 crc kubenswrapper[4873]: I0219 10:01:39.725275 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="thanos-sidecar" containerID="cri-o://7feaa15c2585dfd6c2a4ea45a4afeb2729895aad024065a30a27359792d333ac" gracePeriod=600 Feb 19 10:01:40 crc kubenswrapper[4873]: I0219 10:01:40.711772 4873 generic.go:334] "Generic (PLEG): container finished" podID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerID="7feaa15c2585dfd6c2a4ea45a4afeb2729895aad024065a30a27359792d333ac" exitCode=0 Feb 19 10:01:40 crc kubenswrapper[4873]: I0219 10:01:40.712239 4873 generic.go:334] "Generic (PLEG): container finished" podID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerID="25a26e338e379f85c4c91e347569e9af0c97d68170a522acb020aee8d309e23c" exitCode=0 Feb 19 10:01:40 crc kubenswrapper[4873]: I0219 10:01:40.711863 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"0b46b116-4858-4b6a-b3ad-9337272f9a91","Type":"ContainerDied","Data":"7feaa15c2585dfd6c2a4ea45a4afeb2729895aad024065a30a27359792d333ac"} Feb 19 10:01:40 crc kubenswrapper[4873]: I0219 10:01:40.712285 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0b46b116-4858-4b6a-b3ad-9337272f9a91","Type":"ContainerDied","Data":"25a26e338e379f85c4c91e347569e9af0c97d68170a522acb020aee8d309e23c"} Feb 19 10:01:40 crc kubenswrapper[4873]: I0219 10:01:40.712298 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0b46b116-4858-4b6a-b3ad-9337272f9a91","Type":"ContainerDied","Data":"8c1b23b8ffccae7306c11a4c3d6415747d09b3586d46bdcc68e728a0936c32a0"} Feb 19 10:01:40 crc kubenswrapper[4873]: I0219 10:01:40.712255 4873 generic.go:334] "Generic (PLEG): container finished" podID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerID="8c1b23b8ffccae7306c11a4c3d6415747d09b3586d46bdcc68e728a0936c32a0" exitCode=0 Feb 19 10:01:40 crc kubenswrapper[4873]: I0219 10:01:40.718630 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"c294b5748f2caa3d5e07d1a9ff34e328e71a1bbd65030e32834bac700fb4e372"} Feb 19 10:01:40 crc kubenswrapper[4873]: I0219 10:01:40.718659 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"3ecd75568a15b9478406c53fa7acd0f225ffe936f015c69c59c50c3d47a6b018"} Feb 19 10:01:40 crc kubenswrapper[4873]: I0219 10:01:40.718670 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"fff6f6a1a6441e525838d9470dedf04959877dc822b5e0d1de8610b7ddb6d89e"} Feb 19 10:01:40 crc kubenswrapper[4873]: I0219 10:01:40.953424 
4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.066940 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-1\") pod \"0b46b116-4858-4b6a-b3ad-9337272f9a91\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.067021 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-web-config\") pod \"0b46b116-4858-4b6a-b3ad-9337272f9a91\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.067267 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") pod \"0b46b116-4858-4b6a-b3ad-9337272f9a91\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.067352 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-config\") pod \"0b46b116-4858-4b6a-b3ad-9337272f9a91\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.067413 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0b46b116-4858-4b6a-b3ad-9337272f9a91-config-out\") pod \"0b46b116-4858-4b6a-b3ad-9337272f9a91\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.067442 
4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0b46b116-4858-4b6a-b3ad-9337272f9a91-tls-assets\") pod \"0b46b116-4858-4b6a-b3ad-9337272f9a91\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.067464 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-thanos-prometheus-http-client-file\") pod \"0b46b116-4858-4b6a-b3ad-9337272f9a91\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.067506 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz72n\" (UniqueName: \"kubernetes.io/projected/0b46b116-4858-4b6a-b3ad-9337272f9a91-kube-api-access-hz72n\") pod \"0b46b116-4858-4b6a-b3ad-9337272f9a91\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.067533 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-0\") pod \"0b46b116-4858-4b6a-b3ad-9337272f9a91\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.068291 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-2\") pod \"0b46b116-4858-4b6a-b3ad-9337272f9a91\" (UID: \"0b46b116-4858-4b6a-b3ad-9337272f9a91\") " Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.067540 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "0b46b116-4858-4b6a-b3ad-9337272f9a91" (UID: "0b46b116-4858-4b6a-b3ad-9337272f9a91"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.068675 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "0b46b116-4858-4b6a-b3ad-9337272f9a91" (UID: "0b46b116-4858-4b6a-b3ad-9337272f9a91"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.069225 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "0b46b116-4858-4b6a-b3ad-9337272f9a91" (UID: "0b46b116-4858-4b6a-b3ad-9337272f9a91"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.073298 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-config" (OuterVolumeSpecName: "config") pod "0b46b116-4858-4b6a-b3ad-9337272f9a91" (UID: "0b46b116-4858-4b6a-b3ad-9337272f9a91"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.075615 4873 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.075652 4873 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.075665 4873 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0b46b116-4858-4b6a-b3ad-9337272f9a91-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.075681 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.077358 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "0b46b116-4858-4b6a-b3ad-9337272f9a91" (UID: "0b46b116-4858-4b6a-b3ad-9337272f9a91"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.077762 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b46b116-4858-4b6a-b3ad-9337272f9a91-config-out" (OuterVolumeSpecName: "config-out") pod "0b46b116-4858-4b6a-b3ad-9337272f9a91" (UID: "0b46b116-4858-4b6a-b3ad-9337272f9a91"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.078302 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b46b116-4858-4b6a-b3ad-9337272f9a91-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0b46b116-4858-4b6a-b3ad-9337272f9a91" (UID: "0b46b116-4858-4b6a-b3ad-9337272f9a91"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.082310 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b46b116-4858-4b6a-b3ad-9337272f9a91-kube-api-access-hz72n" (OuterVolumeSpecName: "kube-api-access-hz72n") pod "0b46b116-4858-4b6a-b3ad-9337272f9a91" (UID: "0b46b116-4858-4b6a-b3ad-9337272f9a91"). InnerVolumeSpecName "kube-api-access-hz72n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.111067 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "0b46b116-4858-4b6a-b3ad-9337272f9a91" (UID: "0b46b116-4858-4b6a-b3ad-9337272f9a91"). InnerVolumeSpecName "pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.111450 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-web-config" (OuterVolumeSpecName: "web-config") pod "0b46b116-4858-4b6a-b3ad-9337272f9a91" (UID: "0b46b116-4858-4b6a-b3ad-9337272f9a91"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.176657 4873 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.176688 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz72n\" (UniqueName: \"kubernetes.io/projected/0b46b116-4858-4b6a-b3ad-9337272f9a91-kube-api-access-hz72n\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.176699 4873 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0b46b116-4858-4b6a-b3ad-9337272f9a91-web-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.176727 4873 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") on node \"crc\" " Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.176739 4873 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0b46b116-4858-4b6a-b3ad-9337272f9a91-config-out\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.176748 4873 reconciler_common.go:293] "Volume 
detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0b46b116-4858-4b6a-b3ad-9337272f9a91-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.198549 4873 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.199329 4873 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d") on node "crc" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.277715 4873 reconciler_common.go:293] "Volume detached for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") on node \"crc\" DevicePath \"\"" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.739505 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"8399a72c47aa3eb27fa21a10ff0df17158123295d4b9c5fef14b8a272df52586"} Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.740036 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c3b21a02-7162-42ca-84cf-e0fa36b04a22","Type":"ContainerStarted","Data":"aa95f52e5f68957729c900c17b88f03f170d472e846005e5e793cf71068d51c7"} Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.745471 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0b46b116-4858-4b6a-b3ad-9337272f9a91","Type":"ContainerDied","Data":"3d0b7f98084ff77ff34a64d3b9fb32fc7993ea571d51b6cb0b24962f0fd5c9ef"} Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.745505 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.745549 4873 scope.go:117] "RemoveContainer" containerID="7feaa15c2585dfd6c2a4ea45a4afeb2729895aad024065a30a27359792d333ac" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.781534 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.304896969 podStartE2EDuration="42.781513688s" podCreationTimestamp="2026-02-19 10:00:59 +0000 UTC" firstStartedPulling="2026-02-19 10:01:33.757203154 +0000 UTC m=+1003.046634792" lastFinishedPulling="2026-02-19 10:01:39.233819873 +0000 UTC m=+1008.523251511" observedRunningTime="2026-02-19 10:01:41.775851278 +0000 UTC m=+1011.065282926" watchObservedRunningTime="2026-02-19 10:01:41.781513688 +0000 UTC m=+1011.070945326" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.802175 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.810857 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.850512 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 10:01:41 crc kubenswrapper[4873]: E0219 10:01:41.853330 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="thanos-sidecar" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.853364 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="thanos-sidecar" Feb 19 10:01:41 crc kubenswrapper[4873]: E0219 10:01:41.853390 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a6680f-7e8e-4326-9401-fde957599477" containerName="mariadb-account-create-update" Feb 19 10:01:41 crc 
kubenswrapper[4873]: I0219 10:01:41.853401 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a6680f-7e8e-4326-9401-fde957599477" containerName="mariadb-account-create-update" Feb 19 10:01:41 crc kubenswrapper[4873]: E0219 10:01:41.853421 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="init-config-reloader" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.853429 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="init-config-reloader" Feb 19 10:01:41 crc kubenswrapper[4873]: E0219 10:01:41.853457 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="config-reloader" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.853465 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="config-reloader" Feb 19 10:01:41 crc kubenswrapper[4873]: E0219 10:01:41.853494 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="prometheus" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.853502 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="prometheus" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.855198 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="prometheus" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.855242 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="config-reloader" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.855263 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="thanos-sidecar" Feb 19 10:01:41 crc 
kubenswrapper[4873]: I0219 10:01:41.855279 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a6680f-7e8e-4326-9401-fde957599477" containerName="mariadb-account-create-update" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.861131 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.869368 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.871553 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.871695 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.871955 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.872181 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.872872 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.873736 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-stpz9" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.873924 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.890966 4873 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 10:01:41 crc kubenswrapper[4873]: I0219 10:01:41.893888 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.004414 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.004762 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.004958 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.005126 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod 
\"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.005288 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-config\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.005456 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.005607 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.005771 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.005955 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.006134 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.006295 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.006494 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.006661 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snd6v\" (UniqueName: \"kubernetes.io/projected/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-kube-api-access-snd6v\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.100877 4873 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f77dfd79f-tg9w4"] Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.102600 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.107700 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.108387 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.108410 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.108446 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.108476 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.108493 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.108514 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.108539 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.108559 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snd6v\" (UniqueName: \"kubernetes.io/projected/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-kube-api-access-snd6v\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.108602 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.108627 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.108653 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.108674 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.108696 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-config\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:01:42 
crc kubenswrapper[4873]: I0219 10:01:42.111328 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.112000 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.112666 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.118547 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.121405 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.123149 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.127594 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.128754 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-config\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.130209 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.130223 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.139265 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.141044 4873 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.141092 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/668a04d4437b4137f130ddea3fc0a68c22db655664b336b39ceb124bf62a44ab/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.141521 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f77dfd79f-tg9w4"]
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.156775 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snd6v\" (UniqueName: \"kubernetes.io/projected/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-kube-api-access-snd6v\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.192264 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") pod \"prometheus-metric-storage-0\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.209880 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-dns-svc\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.209953 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-config\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.209980 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6b5\" (UniqueName: \"kubernetes.io/projected/8e8c0292-715e-4d4d-a552-5229adfc3e74-kube-api-access-2k6b5\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.210005 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-ovsdbserver-sb\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.210186 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-dns-swift-storage-0\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.210250 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-ovsdbserver-nb\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.312081 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-config\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.312147 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6b5\" (UniqueName: \"kubernetes.io/projected/8e8c0292-715e-4d4d-a552-5229adfc3e74-kube-api-access-2k6b5\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.312194 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-ovsdbserver-sb\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.312262 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-dns-swift-storage-0\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.312680 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-ovsdbserver-nb\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.313416 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-ovsdbserver-sb\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.313453 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-config\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.313515 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-dns-swift-storage-0\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.313580 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-dns-svc\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.314148 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-ovsdbserver-nb\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.314397 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-dns-svc\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.341208 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6b5\" (UniqueName: \"kubernetes.io/projected/8e8c0292-715e-4d4d-a552-5229adfc3e74-kube-api-access-2k6b5\") pod \"dnsmasq-dns-5f77dfd79f-tg9w4\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.489893 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 19 10:01:42 crc kubenswrapper[4873]: I0219 10:01:42.527553 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:43 crc kubenswrapper[4873]: I0219 10:01:43.497153 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" path="/var/lib/kubelet/pods/0b46b116-4858-4b6a-b3ad-9337272f9a91/volumes"
Feb 19 10:01:43 crc kubenswrapper[4873]: I0219 10:01:43.638991 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/notifications-rabbitmq-server-0"
Feb 19 10:01:43 crc kubenswrapper[4873]: I0219 10:01:43.864655 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="0b46b116-4858-4b6a-b3ad-9337272f9a91" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.109:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 19 10:01:43 crc kubenswrapper[4873]: I0219 10:01:43.978399 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 19 10:01:44 crc kubenswrapper[4873]: I0219 10:01:44.327264 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.791804 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-f5jnw"]
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.793187 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f5jnw"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.819307 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f5jnw"]
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.894372 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec120760-bb10-44ff-bbb0-ed1665b4e17b-operator-scripts\") pod \"barbican-db-create-f5jnw\" (UID: \"ec120760-bb10-44ff-bbb0-ed1665b4e17b\") " pod="openstack/barbican-db-create-f5jnw"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.894489 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thfk7\" (UniqueName: \"kubernetes.io/projected/ec120760-bb10-44ff-bbb0-ed1665b4e17b-kube-api-access-thfk7\") pod \"barbican-db-create-f5jnw\" (UID: \"ec120760-bb10-44ff-bbb0-ed1665b4e17b\") " pod="openstack/barbican-db-create-f5jnw"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.895040 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-63c5-account-create-update-rqftk"]
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.896177 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-63c5-account-create-update-rqftk"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.899004 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.907863 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-63c5-account-create-update-rqftk"]
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.962608 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-86n9s"]
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.967180 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-86n9s"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.969576 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-27d74"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.970176 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.971041 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.971331 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.974674 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-86n9s"]
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.995489 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thfk7\" (UniqueName: \"kubernetes.io/projected/ec120760-bb10-44ff-bbb0-ed1665b4e17b-kube-api-access-thfk7\") pod \"barbican-db-create-f5jnw\" (UID: \"ec120760-bb10-44ff-bbb0-ed1665b4e17b\") " pod="openstack/barbican-db-create-f5jnw"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.995577 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/679f69ef-9960-4e33-a6aa-09baefabc417-operator-scripts\") pod \"barbican-63c5-account-create-update-rqftk\" (UID: \"679f69ef-9960-4e33-a6aa-09baefabc417\") " pod="openstack/barbican-63c5-account-create-update-rqftk"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.995616 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sht5z\" (UniqueName: \"kubernetes.io/projected/679f69ef-9960-4e33-a6aa-09baefabc417-kube-api-access-sht5z\") pod \"barbican-63c5-account-create-update-rqftk\" (UID: \"679f69ef-9960-4e33-a6aa-09baefabc417\") " pod="openstack/barbican-63c5-account-create-update-rqftk"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.995675 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec120760-bb10-44ff-bbb0-ed1665b4e17b-operator-scripts\") pod \"barbican-db-create-f5jnw\" (UID: \"ec120760-bb10-44ff-bbb0-ed1665b4e17b\") " pod="openstack/barbican-db-create-f5jnw"
Feb 19 10:01:45 crc kubenswrapper[4873]: I0219 10:01:45.996405 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec120760-bb10-44ff-bbb0-ed1665b4e17b-operator-scripts\") pod \"barbican-db-create-f5jnw\" (UID: \"ec120760-bb10-44ff-bbb0-ed1665b4e17b\") " pod="openstack/barbican-db-create-f5jnw"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.011570 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-nfk5h"]
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.012764 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nfk5h"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.029612 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nfk5h"]
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.037518 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thfk7\" (UniqueName: \"kubernetes.io/projected/ec120760-bb10-44ff-bbb0-ed1665b4e17b-kube-api-access-thfk7\") pod \"barbican-db-create-f5jnw\" (UID: \"ec120760-bb10-44ff-bbb0-ed1665b4e17b\") " pod="openstack/barbican-db-create-f5jnw"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.097188 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735c003d-082d-431f-9906-20c8946f1bf4-config-data\") pod \"keystone-db-sync-86n9s\" (UID: \"735c003d-082d-431f-9906-20c8946f1bf4\") " pod="openstack/keystone-db-sync-86n9s"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.097225 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4qhl\" (UniqueName: \"kubernetes.io/projected/735c003d-082d-431f-9906-20c8946f1bf4-kube-api-access-p4qhl\") pod \"keystone-db-sync-86n9s\" (UID: \"735c003d-082d-431f-9906-20c8946f1bf4\") " pod="openstack/keystone-db-sync-86n9s"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.097266 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4hbs\" (UniqueName: \"kubernetes.io/projected/94c35b26-7dc1-4cea-bbe7-53a9e47df7ba-kube-api-access-j4hbs\") pod \"neutron-db-create-nfk5h\" (UID: \"94c35b26-7dc1-4cea-bbe7-53a9e47df7ba\") " pod="openstack/neutron-db-create-nfk5h"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.097309 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94c35b26-7dc1-4cea-bbe7-53a9e47df7ba-operator-scripts\") pod \"neutron-db-create-nfk5h\" (UID: \"94c35b26-7dc1-4cea-bbe7-53a9e47df7ba\") " pod="openstack/neutron-db-create-nfk5h"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.097377 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/679f69ef-9960-4e33-a6aa-09baefabc417-operator-scripts\") pod \"barbican-63c5-account-create-update-rqftk\" (UID: \"679f69ef-9960-4e33-a6aa-09baefabc417\") " pod="openstack/barbican-63c5-account-create-update-rqftk"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.097397 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735c003d-082d-431f-9906-20c8946f1bf4-combined-ca-bundle\") pod \"keystone-db-sync-86n9s\" (UID: \"735c003d-082d-431f-9906-20c8946f1bf4\") " pod="openstack/keystone-db-sync-86n9s"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.097421 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sht5z\" (UniqueName: \"kubernetes.io/projected/679f69ef-9960-4e33-a6aa-09baefabc417-kube-api-access-sht5z\") pod \"barbican-63c5-account-create-update-rqftk\" (UID: \"679f69ef-9960-4e33-a6aa-09baefabc417\") " pod="openstack/barbican-63c5-account-create-update-rqftk"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.098236 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/679f69ef-9960-4e33-a6aa-09baefabc417-operator-scripts\") pod \"barbican-63c5-account-create-update-rqftk\" (UID: \"679f69ef-9960-4e33-a6aa-09baefabc417\") " pod="openstack/barbican-63c5-account-create-update-rqftk"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.101966 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-73d0-account-create-update-vwj8q"]
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.103179 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-73d0-account-create-update-vwj8q"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.109142 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.114715 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f5jnw"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.117827 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-73d0-account-create-update-vwj8q"]
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.139063 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sht5z\" (UniqueName: \"kubernetes.io/projected/679f69ef-9960-4e33-a6aa-09baefabc417-kube-api-access-sht5z\") pod \"barbican-63c5-account-create-update-rqftk\" (UID: \"679f69ef-9960-4e33-a6aa-09baefabc417\") " pod="openstack/barbican-63c5-account-create-update-rqftk"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.199077 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735c003d-082d-431f-9906-20c8946f1bf4-combined-ca-bundle\") pod \"keystone-db-sync-86n9s\" (UID: \"735c003d-082d-431f-9906-20c8946f1bf4\") " pod="openstack/keystone-db-sync-86n9s"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.199171 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jdc5\" (UniqueName: \"kubernetes.io/projected/fd7769ae-caf0-4f62-be96-90d6fa334259-kube-api-access-5jdc5\") pod \"neutron-73d0-account-create-update-vwj8q\" (UID: \"fd7769ae-caf0-4f62-be96-90d6fa334259\") " pod="openstack/neutron-73d0-account-create-update-vwj8q"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.199238 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735c003d-082d-431f-9906-20c8946f1bf4-config-data\") pod \"keystone-db-sync-86n9s\" (UID: \"735c003d-082d-431f-9906-20c8946f1bf4\") " pod="openstack/keystone-db-sync-86n9s"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.199260 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4qhl\" (UniqueName: \"kubernetes.io/projected/735c003d-082d-431f-9906-20c8946f1bf4-kube-api-access-p4qhl\") pod \"keystone-db-sync-86n9s\" (UID: \"735c003d-082d-431f-9906-20c8946f1bf4\") " pod="openstack/keystone-db-sync-86n9s"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.199311 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4hbs\" (UniqueName: \"kubernetes.io/projected/94c35b26-7dc1-4cea-bbe7-53a9e47df7ba-kube-api-access-j4hbs\") pod \"neutron-db-create-nfk5h\" (UID: \"94c35b26-7dc1-4cea-bbe7-53a9e47df7ba\") " pod="openstack/neutron-db-create-nfk5h"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.199357 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94c35b26-7dc1-4cea-bbe7-53a9e47df7ba-operator-scripts\") pod \"neutron-db-create-nfk5h\" (UID: \"94c35b26-7dc1-4cea-bbe7-53a9e47df7ba\") " pod="openstack/neutron-db-create-nfk5h"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.199393 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd7769ae-caf0-4f62-be96-90d6fa334259-operator-scripts\") pod \"neutron-73d0-account-create-update-vwj8q\" (UID: \"fd7769ae-caf0-4f62-be96-90d6fa334259\") " pod="openstack/neutron-73d0-account-create-update-vwj8q"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.200295 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94c35b26-7dc1-4cea-bbe7-53a9e47df7ba-operator-scripts\") pod \"neutron-db-create-nfk5h\" (UID: \"94c35b26-7dc1-4cea-bbe7-53a9e47df7ba\") " pod="openstack/neutron-db-create-nfk5h"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.203748 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735c003d-082d-431f-9906-20c8946f1bf4-combined-ca-bundle\") pod \"keystone-db-sync-86n9s\" (UID: \"735c003d-082d-431f-9906-20c8946f1bf4\") " pod="openstack/keystone-db-sync-86n9s"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.216797 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-63c5-account-create-update-rqftk"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.218927 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735c003d-082d-431f-9906-20c8946f1bf4-config-data\") pod \"keystone-db-sync-86n9s\" (UID: \"735c003d-082d-431f-9906-20c8946f1bf4\") " pod="openstack/keystone-db-sync-86n9s"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.228521 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4hbs\" (UniqueName: \"kubernetes.io/projected/94c35b26-7dc1-4cea-bbe7-53a9e47df7ba-kube-api-access-j4hbs\") pod \"neutron-db-create-nfk5h\" (UID: \"94c35b26-7dc1-4cea-bbe7-53a9e47df7ba\") " pod="openstack/neutron-db-create-nfk5h"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.243573 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4qhl\" (UniqueName: \"kubernetes.io/projected/735c003d-082d-431f-9906-20c8946f1bf4-kube-api-access-p4qhl\") pod \"keystone-db-sync-86n9s\" (UID: \"735c003d-082d-431f-9906-20c8946f1bf4\") " pod="openstack/keystone-db-sync-86n9s"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.292442 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-86n9s"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.300830 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jdc5\" (UniqueName: \"kubernetes.io/projected/fd7769ae-caf0-4f62-be96-90d6fa334259-kube-api-access-5jdc5\") pod \"neutron-73d0-account-create-update-vwj8q\" (UID: \"fd7769ae-caf0-4f62-be96-90d6fa334259\") " pod="openstack/neutron-73d0-account-create-update-vwj8q"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.301001 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd7769ae-caf0-4f62-be96-90d6fa334259-operator-scripts\") pod \"neutron-73d0-account-create-update-vwj8q\" (UID: \"fd7769ae-caf0-4f62-be96-90d6fa334259\") " pod="openstack/neutron-73d0-account-create-update-vwj8q"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.301738 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd7769ae-caf0-4f62-be96-90d6fa334259-operator-scripts\") pod \"neutron-73d0-account-create-update-vwj8q\" (UID: \"fd7769ae-caf0-4f62-be96-90d6fa334259\") " pod="openstack/neutron-73d0-account-create-update-vwj8q"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.318334 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jdc5\" (UniqueName: \"kubernetes.io/projected/fd7769ae-caf0-4f62-be96-90d6fa334259-kube-api-access-5jdc5\") pod \"neutron-73d0-account-create-update-vwj8q\" (UID: \"fd7769ae-caf0-4f62-be96-90d6fa334259\") " pod="openstack/neutron-73d0-account-create-update-vwj8q"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.327070 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nfk5h"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.400622 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-w65h5"]
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.403139 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-w65h5"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.416504 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-w65h5"]
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.427478 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-73d0-account-create-update-vwj8q"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.502372 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-45f0-account-create-update-b4rvj"]
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.503452 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-45f0-account-create-update-b4rvj"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.508013 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6wv4\" (UniqueName: \"kubernetes.io/projected/e1f97f25-d006-40d7-a090-ab45ab11b282-kube-api-access-m6wv4\") pod \"cinder-db-create-w65h5\" (UID: \"e1f97f25-d006-40d7-a090-ab45ab11b282\") " pod="openstack/cinder-db-create-w65h5"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.508201 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1f97f25-d006-40d7-a090-ab45ab11b282-operator-scripts\") pod \"cinder-db-create-w65h5\" (UID: \"e1f97f25-d006-40d7-a090-ab45ab11b282\") " pod="openstack/cinder-db-create-w65h5"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.519126 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.524746 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-45f0-account-create-update-b4rvj"]
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.611224 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6wv4\" (UniqueName: \"kubernetes.io/projected/e1f97f25-d006-40d7-a090-ab45ab11b282-kube-api-access-m6wv4\") pod \"cinder-db-create-w65h5\" (UID: \"e1f97f25-d006-40d7-a090-ab45ab11b282\") " pod="openstack/cinder-db-create-w65h5"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.611316 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj5pl\" (UniqueName: \"kubernetes.io/projected/bf0daf0d-c150-49de-98af-3f65dd78112f-kube-api-access-tj5pl\") pod \"cinder-45f0-account-create-update-b4rvj\" (UID: \"bf0daf0d-c150-49de-98af-3f65dd78112f\") " pod="openstack/cinder-45f0-account-create-update-b4rvj"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.611372 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0daf0d-c150-49de-98af-3f65dd78112f-operator-scripts\") pod \"cinder-45f0-account-create-update-b4rvj\" (UID: \"bf0daf0d-c150-49de-98af-3f65dd78112f\") " pod="openstack/cinder-45f0-account-create-update-b4rvj"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.611407 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1f97f25-d006-40d7-a090-ab45ab11b282-operator-scripts\") pod \"cinder-db-create-w65h5\" (UID: \"e1f97f25-d006-40d7-a090-ab45ab11b282\") " pod="openstack/cinder-db-create-w65h5"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.612246 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1f97f25-d006-40d7-a090-ab45ab11b282-operator-scripts\") pod \"cinder-db-create-w65h5\" (UID: \"e1f97f25-d006-40d7-a090-ab45ab11b282\") " pod="openstack/cinder-db-create-w65h5"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.647009 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6wv4\" (UniqueName: \"kubernetes.io/projected/e1f97f25-d006-40d7-a090-ab45ab11b282-kube-api-access-m6wv4\") pod \"cinder-db-create-w65h5\" (UID: \"e1f97f25-d006-40d7-a090-ab45ab11b282\") " pod="openstack/cinder-db-create-w65h5"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.691954 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-k6j2h"]
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.692996 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-k6j2h"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.695864 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-m755p"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.702405 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.713076 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-combined-ca-bundle\") pod \"watcher-db-sync-k6j2h\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " pod="openstack/watcher-db-sync-k6j2h"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.713169 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj5pl\" (UniqueName: \"kubernetes.io/projected/bf0daf0d-c150-49de-98af-3f65dd78112f-kube-api-access-tj5pl\") pod \"cinder-45f0-account-create-update-b4rvj\" (UID: \"bf0daf0d-c150-49de-98af-3f65dd78112f\") " pod="openstack/cinder-45f0-account-create-update-b4rvj"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.713244 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0daf0d-c150-49de-98af-3f65dd78112f-operator-scripts\") pod \"cinder-45f0-account-create-update-b4rvj\" (UID: \"bf0daf0d-c150-49de-98af-3f65dd78112f\") " pod="openstack/cinder-45f0-account-create-update-b4rvj"
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.713282 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-db-sync-config-data\") pod \"watcher-db-sync-k6j2h\" (UID:
\"a075072a-1153-4963-91c7-e9e2aa08f988\") " pod="openstack/watcher-db-sync-k6j2h" Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.713309 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-config-data\") pod \"watcher-db-sync-k6j2h\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " pod="openstack/watcher-db-sync-k6j2h" Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.713343 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t66wv\" (UniqueName: \"kubernetes.io/projected/a075072a-1153-4963-91c7-e9e2aa08f988-kube-api-access-t66wv\") pod \"watcher-db-sync-k6j2h\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " pod="openstack/watcher-db-sync-k6j2h" Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.714365 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0daf0d-c150-49de-98af-3f65dd78112f-operator-scripts\") pod \"cinder-45f0-account-create-update-b4rvj\" (UID: \"bf0daf0d-c150-49de-98af-3f65dd78112f\") " pod="openstack/cinder-45f0-account-create-update-b4rvj" Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.718458 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-k6j2h"] Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.739520 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-w65h5" Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.751248 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj5pl\" (UniqueName: \"kubernetes.io/projected/bf0daf0d-c150-49de-98af-3f65dd78112f-kube-api-access-tj5pl\") pod \"cinder-45f0-account-create-update-b4rvj\" (UID: \"bf0daf0d-c150-49de-98af-3f65dd78112f\") " pod="openstack/cinder-45f0-account-create-update-b4rvj" Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.814788 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t66wv\" (UniqueName: \"kubernetes.io/projected/a075072a-1153-4963-91c7-e9e2aa08f988-kube-api-access-t66wv\") pod \"watcher-db-sync-k6j2h\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " pod="openstack/watcher-db-sync-k6j2h" Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.814882 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-combined-ca-bundle\") pod \"watcher-db-sync-k6j2h\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " pod="openstack/watcher-db-sync-k6j2h" Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.814941 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-db-sync-config-data\") pod \"watcher-db-sync-k6j2h\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " pod="openstack/watcher-db-sync-k6j2h" Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.814969 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-config-data\") pod \"watcher-db-sync-k6j2h\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " pod="openstack/watcher-db-sync-k6j2h" 
Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.820519 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-db-sync-config-data\") pod \"watcher-db-sync-k6j2h\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " pod="openstack/watcher-db-sync-k6j2h" Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.820565 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-config-data\") pod \"watcher-db-sync-k6j2h\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " pod="openstack/watcher-db-sync-k6j2h" Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.828297 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-combined-ca-bundle\") pod \"watcher-db-sync-k6j2h\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " pod="openstack/watcher-db-sync-k6j2h" Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.833514 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-45f0-account-create-update-b4rvj" Feb 19 10:01:46 crc kubenswrapper[4873]: I0219 10:01:46.853705 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t66wv\" (UniqueName: \"kubernetes.io/projected/a075072a-1153-4963-91c7-e9e2aa08f988-kube-api-access-t66wv\") pod \"watcher-db-sync-k6j2h\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " pod="openstack/watcher-db-sync-k6j2h" Feb 19 10:01:47 crc kubenswrapper[4873]: I0219 10:01:47.011677 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-k6j2h" Feb 19 10:01:48 crc kubenswrapper[4873]: I0219 10:01:48.969354 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lmkgp" podUID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerName="registry-server" probeResult="failure" output=< Feb 19 10:01:48 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s Feb 19 10:01:48 crc kubenswrapper[4873]: > Feb 19 10:01:51 crc kubenswrapper[4873]: I0219 10:01:51.715185 4873 scope.go:117] "RemoveContainer" containerID="25a26e338e379f85c4c91e347569e9af0c97d68170a522acb020aee8d309e23c" Feb 19 10:01:51 crc kubenswrapper[4873]: E0219 10:01:51.736195 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Feb 19 10:01:51 crc kubenswrapper[4873]: E0219 10:01:51.736265 4873 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Feb 19 10:01:51 crc kubenswrapper[4873]: E0219 10:01:51.736441 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:38.102.83.20:5001/podified-master-centos10/openstack-glance-api:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btfz7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-9472r_openstack(7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Feb 19 10:01:51 crc kubenswrapper[4873]: E0219 10:01:51.737634 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-9472r" podUID="7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba" Feb 19 10:01:51 crc kubenswrapper[4873]: E0219 10:01:51.873866 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.20:5001/podified-master-centos10/openstack-glance-api:watcher_latest\\\"\"" pod="openstack/glance-db-sync-9472r" podUID="7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba" Feb 19 10:01:51 crc kubenswrapper[4873]: I0219 10:01:51.874254 4873 scope.go:117] "RemoveContainer" containerID="8c1b23b8ffccae7306c11a4c3d6415747d09b3586d46bdcc68e728a0936c32a0" Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.097149 4873 scope.go:117] "RemoveContainer" containerID="7b07d5d4936c52da51ddc101b9f0cc93c881b6b0ec5f359e20ad961f5451f0a9" Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.428487 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-73d0-account-create-update-vwj8q"] Feb 19 10:01:52 crc kubenswrapper[4873]: W0219 10:01:52.435724 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd7769ae_caf0_4f62_be96_90d6fa334259.slice/crio-37e3d42b2018f2672ea117a6adcb9cd16be3ac6b642d4d999186c536331b880a WatchSource:0}: Error finding container 37e3d42b2018f2672ea117a6adcb9cd16be3ac6b642d4d999186c536331b880a: Status 404 returned error can't find the container with id 37e3d42b2018f2672ea117a6adcb9cd16be3ac6b642d4d999186c536331b880a Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.438571 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-db-create-f5jnw"] Feb 19 10:01:52 crc kubenswrapper[4873]: W0219 10:01:52.440479 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec120760_bb10_44ff_bbb0_ed1665b4e17b.slice/crio-dca669af7f7e92df448e624a6bcc1f9ab5d919831ea4f5a0d25f76e099c94782 WatchSource:0}: Error finding container dca669af7f7e92df448e624a6bcc1f9ab5d919831ea4f5a0d25f76e099c94782: Status 404 returned error can't find the container with id dca669af7f7e92df448e624a6bcc1f9ab5d919831ea4f5a0d25f76e099c94782 Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.441223 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.448357 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.453490 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-63c5-account-create-update-rqftk"] Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.600081 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-86n9s"] Feb 19 10:01:52 crc kubenswrapper[4873]: W0219 10:01:52.603931 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94c35b26_7dc1_4cea_bbe7_53a9e47df7ba.slice/crio-6fbd3a2c469215faf7d0f0cdd4481319d382cf3f96f79d5bbcd5880004e38acc WatchSource:0}: Error finding container 6fbd3a2c469215faf7d0f0cdd4481319d382cf3f96f79d5bbcd5880004e38acc: Status 404 returned error can't find the container with id 6fbd3a2c469215faf7d0f0cdd4481319d382cf3f96f79d5bbcd5880004e38acc Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.652855 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-w65h5"] Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 
10:01:52.667159 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.691403 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nfk5h"] Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.711699 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.721024 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-k6j2h"] Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.730413 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-45f0-account-create-update-b4rvj"] Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.806878 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f77dfd79f-tg9w4"] Feb 19 10:01:52 crc kubenswrapper[4873]: W0219 10:01:52.825700 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e8c0292_715e_4d4d_a552_5229adfc3e74.slice/crio-04cdcce41cf06a6a4c22d13c5a42c60370cf2135128656d07548457adad958ae WatchSource:0}: Error finding container 04cdcce41cf06a6a4c22d13c5a42c60370cf2135128656d07548457adad958ae: Status 404 returned error can't find the container with id 04cdcce41cf06a6a4c22d13c5a42c60370cf2135128656d07548457adad958ae Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.863444 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nfk5h" event={"ID":"94c35b26-7dc1-4cea-bbe7-53a9e47df7ba","Type":"ContainerStarted","Data":"6fbd3a2c469215faf7d0f0cdd4481319d382cf3f96f79d5bbcd5880004e38acc"} Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.865697 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"d1070e0c-7518-4d1b-bbb8-e56db1cad28a","Type":"ContainerStarted","Data":"67a74c25d6b44ccd6cb397b300a6cd2025bf7fa88890d389ef81197cfb4ef22d"} Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.867515 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-w65h5" event={"ID":"e1f97f25-d006-40d7-a090-ab45ab11b282","Type":"ContainerStarted","Data":"c2bf0fbba2f57f774b52845ad4c11caabc24a49278d94ff2cd48b137e6aeb541"} Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.867551 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-w65h5" event={"ID":"e1f97f25-d006-40d7-a090-ab45ab11b282","Type":"ContainerStarted","Data":"6ff99127cb67f2960d3ae8f41a09b021685c8e4644ccb65876e46a1f18c275f3"} Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.872272 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-86n9s" event={"ID":"735c003d-082d-431f-9906-20c8946f1bf4","Type":"ContainerStarted","Data":"e7a7027c2775e72cf8a30257c067cda81d18ed0d292fcd29a26190fc119c919e"} Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.877523 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-k6j2h" event={"ID":"a075072a-1153-4963-91c7-e9e2aa08f988","Type":"ContainerStarted","Data":"5743e2884f1a9a199fc04d17b81aca13969dd3eeccc42a1ad3851609621f80db"} Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.879413 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-45f0-account-create-update-b4rvj" event={"ID":"bf0daf0d-c150-49de-98af-3f65dd78112f","Type":"ContainerStarted","Data":"eb552ca98ac78dec98734f67b2cfc6dd764f57bca299edb1d904e9f18a03a5a9"} Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.887501 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-63c5-account-create-update-rqftk" 
event={"ID":"679f69ef-9960-4e33-a6aa-09baefabc417","Type":"ContainerStarted","Data":"c237b902a14f464df461cba85ac2f3875c00ea9082d53eccd8620f9ff36dbdf1"} Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.887538 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-63c5-account-create-update-rqftk" event={"ID":"679f69ef-9960-4e33-a6aa-09baefabc417","Type":"ContainerStarted","Data":"8ba33e912f464ea0722df73f3baa819208f1910932806de73fe2ef0c44ba7498"} Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.892931 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" event={"ID":"8e8c0292-715e-4d4d-a552-5229adfc3e74","Type":"ContainerStarted","Data":"04cdcce41cf06a6a4c22d13c5a42c60370cf2135128656d07548457adad958ae"} Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.894677 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-w65h5" podStartSLOduration=6.894663674 podStartE2EDuration="6.894663674s" podCreationTimestamp="2026-02-19 10:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:01:52.885838785 +0000 UTC m=+1022.175270423" watchObservedRunningTime="2026-02-19 10:01:52.894663674 +0000 UTC m=+1022.184095312" Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.895484 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73d0-account-create-update-vwj8q" event={"ID":"fd7769ae-caf0-4f62-be96-90d6fa334259","Type":"ContainerStarted","Data":"fc653719445b1a5b46a480337ffb17668e9e8b070f487de6554f5c2e305620c3"} Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.895510 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73d0-account-create-update-vwj8q" 
event={"ID":"fd7769ae-caf0-4f62-be96-90d6fa334259","Type":"ContainerStarted","Data":"37e3d42b2018f2672ea117a6adcb9cd16be3ac6b642d4d999186c536331b880a"} Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.897839 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f5jnw" event={"ID":"ec120760-bb10-44ff-bbb0-ed1665b4e17b","Type":"ContainerStarted","Data":"270c7b1e210c60f9930081568a2a368a094d153be46a4131b2c800c6cabb0758"} Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.897869 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f5jnw" event={"ID":"ec120760-bb10-44ff-bbb0-ed1665b4e17b","Type":"ContainerStarted","Data":"dca669af7f7e92df448e624a6bcc1f9ab5d919831ea4f5a0d25f76e099c94782"} Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.905880 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-63c5-account-create-update-rqftk" podStartSLOduration=7.905857902 podStartE2EDuration="7.905857902s" podCreationTimestamp="2026-02-19 10:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:01:52.899880774 +0000 UTC m=+1022.189312412" watchObservedRunningTime="2026-02-19 10:01:52.905857902 +0000 UTC m=+1022.195289540" Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.930435 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-f5jnw" podStartSLOduration=7.930392991 podStartE2EDuration="7.930392991s" podCreationTimestamp="2026-02-19 10:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:01:52.927184851 +0000 UTC m=+1022.216616489" watchObservedRunningTime="2026-02-19 10:01:52.930392991 +0000 UTC m=+1022.219824629" Feb 19 10:01:52 crc kubenswrapper[4873]: I0219 10:01:52.946127 4873 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-73d0-account-create-update-vwj8q" podStartSLOduration=6.94607835 podStartE2EDuration="6.94607835s" podCreationTimestamp="2026-02-19 10:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:01:52.940348678 +0000 UTC m=+1022.229780316" watchObservedRunningTime="2026-02-19 10:01:52.94607835 +0000 UTC m=+1022.235509988" Feb 19 10:01:53 crc kubenswrapper[4873]: I0219 10:01:53.907095 4873 generic.go:334] "Generic (PLEG): container finished" podID="8e8c0292-715e-4d4d-a552-5229adfc3e74" containerID="c38f23c9308a52dc889562a59a6b3d3134f3aebd40d9ab2a2804a839bf127153" exitCode=0 Feb 19 10:01:53 crc kubenswrapper[4873]: I0219 10:01:53.907328 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" event={"ID":"8e8c0292-715e-4d4d-a552-5229adfc3e74","Type":"ContainerDied","Data":"c38f23c9308a52dc889562a59a6b3d3134f3aebd40d9ab2a2804a839bf127153"} Feb 19 10:01:53 crc kubenswrapper[4873]: I0219 10:01:53.912042 4873 generic.go:334] "Generic (PLEG): container finished" podID="bf0daf0d-c150-49de-98af-3f65dd78112f" containerID="88965eb31897e7c9f4b9aa04da422e3396b97ead67a5f74aaa92bd82cf049dc5" exitCode=0 Feb 19 10:01:53 crc kubenswrapper[4873]: I0219 10:01:53.912287 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-45f0-account-create-update-b4rvj" event={"ID":"bf0daf0d-c150-49de-98af-3f65dd78112f","Type":"ContainerDied","Data":"88965eb31897e7c9f4b9aa04da422e3396b97ead67a5f74aaa92bd82cf049dc5"} Feb 19 10:01:53 crc kubenswrapper[4873]: I0219 10:01:53.928745 4873 generic.go:334] "Generic (PLEG): container finished" podID="fd7769ae-caf0-4f62-be96-90d6fa334259" containerID="fc653719445b1a5b46a480337ffb17668e9e8b070f487de6554f5c2e305620c3" exitCode=0 Feb 19 10:01:53 crc kubenswrapper[4873]: I0219 10:01:53.928820 4873 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-73d0-account-create-update-vwj8q" event={"ID":"fd7769ae-caf0-4f62-be96-90d6fa334259","Type":"ContainerDied","Data":"fc653719445b1a5b46a480337ffb17668e9e8b070f487de6554f5c2e305620c3"} Feb 19 10:01:53 crc kubenswrapper[4873]: I0219 10:01:53.942923 4873 generic.go:334] "Generic (PLEG): container finished" podID="ec120760-bb10-44ff-bbb0-ed1665b4e17b" containerID="270c7b1e210c60f9930081568a2a368a094d153be46a4131b2c800c6cabb0758" exitCode=0 Feb 19 10:01:53 crc kubenswrapper[4873]: I0219 10:01:53.942986 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f5jnw" event={"ID":"ec120760-bb10-44ff-bbb0-ed1665b4e17b","Type":"ContainerDied","Data":"270c7b1e210c60f9930081568a2a368a094d153be46a4131b2c800c6cabb0758"} Feb 19 10:01:53 crc kubenswrapper[4873]: I0219 10:01:53.959269 4873 generic.go:334] "Generic (PLEG): container finished" podID="e1f97f25-d006-40d7-a090-ab45ab11b282" containerID="c2bf0fbba2f57f774b52845ad4c11caabc24a49278d94ff2cd48b137e6aeb541" exitCode=0 Feb 19 10:01:53 crc kubenswrapper[4873]: I0219 10:01:53.959346 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-w65h5" event={"ID":"e1f97f25-d006-40d7-a090-ab45ab11b282","Type":"ContainerDied","Data":"c2bf0fbba2f57f774b52845ad4c11caabc24a49278d94ff2cd48b137e6aeb541"} Feb 19 10:01:53 crc kubenswrapper[4873]: I0219 10:01:53.967868 4873 generic.go:334] "Generic (PLEG): container finished" podID="679f69ef-9960-4e33-a6aa-09baefabc417" containerID="c237b902a14f464df461cba85ac2f3875c00ea9082d53eccd8620f9ff36dbdf1" exitCode=0 Feb 19 10:01:53 crc kubenswrapper[4873]: I0219 10:01:53.967921 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-63c5-account-create-update-rqftk" event={"ID":"679f69ef-9960-4e33-a6aa-09baefabc417","Type":"ContainerDied","Data":"c237b902a14f464df461cba85ac2f3875c00ea9082d53eccd8620f9ff36dbdf1"} Feb 19 10:01:53 crc kubenswrapper[4873]: I0219 
10:01:53.974128 4873 generic.go:334] "Generic (PLEG): container finished" podID="94c35b26-7dc1-4cea-bbe7-53a9e47df7ba" containerID="7d9cb5cecd99aa90e0a6558ac0a3e7fa7ae0c94550c983a65f7942335964abac" exitCode=0
Feb 19 10:01:53 crc kubenswrapper[4873]: I0219 10:01:53.974155 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nfk5h" event={"ID":"94c35b26-7dc1-4cea-bbe7-53a9e47df7ba","Type":"ContainerDied","Data":"7d9cb5cecd99aa90e0a6558ac0a3e7fa7ae0c94550c983a65f7942335964abac"}
Feb 19 10:01:54 crc kubenswrapper[4873]: I0219 10:01:54.987418 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" event={"ID":"8e8c0292-715e-4d4d-a552-5229adfc3e74","Type":"ContainerStarted","Data":"3202df88b506237a1560baea9fc86854fa472069f50ad6c6f94a7855eaa6ff1a"}
Feb 19 10:01:54 crc kubenswrapper[4873]: I0219 10:01:54.988190 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:01:55 crc kubenswrapper[4873]: I0219 10:01:55.011720 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" podStartSLOduration=13.011701127 podStartE2EDuration="13.011701127s" podCreationTimestamp="2026-02-19 10:01:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:01:55.009387479 +0000 UTC m=+1024.298819127" watchObservedRunningTime="2026-02-19 10:01:55.011701127 +0000 UTC m=+1024.301132775"
Feb 19 10:01:55 crc kubenswrapper[4873]: I0219 10:01:55.995473 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d1070e0c-7518-4d1b-bbb8-e56db1cad28a","Type":"ContainerStarted","Data":"5dba258377743a5715353e87db5d11c9c4d59b17ccf94ca70c8d425eaaad3a52"}
Feb 19 10:01:57 crc kubenswrapper[4873]: I0219 10:01:57.979868 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:01:58 crc kubenswrapper[4873]: I0219 10:01:58.060438 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:01:58 crc kubenswrapper[4873]: I0219 10:01:58.225438 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lmkgp"]
Feb 19 10:01:59 crc kubenswrapper[4873]: I0219 10:01:59.043620 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lmkgp" podUID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerName="registry-server" containerID="cri-o://a571f36e40c7ad937de6c80c4fe2960d3a56ca51d705e8212a16e2333ee169bb" gracePeriod=2
Feb 19 10:02:00 crc kubenswrapper[4873]: I0219 10:02:00.052877 4873 generic.go:334] "Generic (PLEG): container finished" podID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerID="a571f36e40c7ad937de6c80c4fe2960d3a56ca51d705e8212a16e2333ee169bb" exitCode=0
Feb 19 10:02:00 crc kubenswrapper[4873]: I0219 10:02:00.052920 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmkgp" event={"ID":"d115a791-c703-4c6e-91e5-8f3ab9608277","Type":"ContainerDied","Data":"a571f36e40c7ad937de6c80c4fe2960d3a56ca51d705e8212a16e2333ee169bb"}
Feb 19 10:02:01 crc kubenswrapper[4873]: I0219 10:02:01.464955 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f5jnw"
Feb 19 10:02:01 crc kubenswrapper[4873]: I0219 10:02:01.479175 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-45f0-account-create-update-b4rvj"
Feb 19 10:02:01 crc kubenswrapper[4873]: I0219 10:02:01.495319 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nfk5h"
Feb 19 10:02:01 crc kubenswrapper[4873]: I0219 10:02:01.498081 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-63c5-account-create-update-rqftk"
Feb 19 10:02:01 crc kubenswrapper[4873]: I0219 10:02:01.500399 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-73d0-account-create-update-vwj8q"
Feb 19 10:02:01 crc kubenswrapper[4873]: I0219 10:02:01.520738 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-w65h5"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.620437 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1f97f25-d006-40d7-a090-ab45ab11b282-operator-scripts\") pod \"e1f97f25-d006-40d7-a090-ab45ab11b282\" (UID: \"e1f97f25-d006-40d7-a090-ab45ab11b282\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.620518 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec120760-bb10-44ff-bbb0-ed1665b4e17b-operator-scripts\") pod \"ec120760-bb10-44ff-bbb0-ed1665b4e17b\" (UID: \"ec120760-bb10-44ff-bbb0-ed1665b4e17b\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.620558 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd7769ae-caf0-4f62-be96-90d6fa334259-operator-scripts\") pod \"fd7769ae-caf0-4f62-be96-90d6fa334259\" (UID: \"fd7769ae-caf0-4f62-be96-90d6fa334259\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.620604 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jdc5\" (UniqueName: \"kubernetes.io/projected/fd7769ae-caf0-4f62-be96-90d6fa334259-kube-api-access-5jdc5\") pod \"fd7769ae-caf0-4f62-be96-90d6fa334259\" (UID: \"fd7769ae-caf0-4f62-be96-90d6fa334259\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.620631 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0daf0d-c150-49de-98af-3f65dd78112f-operator-scripts\") pod \"bf0daf0d-c150-49de-98af-3f65dd78112f\" (UID: \"bf0daf0d-c150-49de-98af-3f65dd78112f\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.620664 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/679f69ef-9960-4e33-a6aa-09baefabc417-operator-scripts\") pod \"679f69ef-9960-4e33-a6aa-09baefabc417\" (UID: \"679f69ef-9960-4e33-a6aa-09baefabc417\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.620735 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sht5z\" (UniqueName: \"kubernetes.io/projected/679f69ef-9960-4e33-a6aa-09baefabc417-kube-api-access-sht5z\") pod \"679f69ef-9960-4e33-a6aa-09baefabc417\" (UID: \"679f69ef-9960-4e33-a6aa-09baefabc417\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.620786 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thfk7\" (UniqueName: \"kubernetes.io/projected/ec120760-bb10-44ff-bbb0-ed1665b4e17b-kube-api-access-thfk7\") pod \"ec120760-bb10-44ff-bbb0-ed1665b4e17b\" (UID: \"ec120760-bb10-44ff-bbb0-ed1665b4e17b\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.620901 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94c35b26-7dc1-4cea-bbe7-53a9e47df7ba-operator-scripts\") pod \"94c35b26-7dc1-4cea-bbe7-53a9e47df7ba\" (UID: \"94c35b26-7dc1-4cea-bbe7-53a9e47df7ba\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.620955 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4hbs\" (UniqueName: \"kubernetes.io/projected/94c35b26-7dc1-4cea-bbe7-53a9e47df7ba-kube-api-access-j4hbs\") pod \"94c35b26-7dc1-4cea-bbe7-53a9e47df7ba\" (UID: \"94c35b26-7dc1-4cea-bbe7-53a9e47df7ba\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.621011 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6wv4\" (UniqueName: \"kubernetes.io/projected/e1f97f25-d006-40d7-a090-ab45ab11b282-kube-api-access-m6wv4\") pod \"e1f97f25-d006-40d7-a090-ab45ab11b282\" (UID: \"e1f97f25-d006-40d7-a090-ab45ab11b282\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.621052 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj5pl\" (UniqueName: \"kubernetes.io/projected/bf0daf0d-c150-49de-98af-3f65dd78112f-kube-api-access-tj5pl\") pod \"bf0daf0d-c150-49de-98af-3f65dd78112f\" (UID: \"bf0daf0d-c150-49de-98af-3f65dd78112f\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.621937 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f97f25-d006-40d7-a090-ab45ab11b282-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e1f97f25-d006-40d7-a090-ab45ab11b282" (UID: "e1f97f25-d006-40d7-a090-ab45ab11b282"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.622448 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/679f69ef-9960-4e33-a6aa-09baefabc417-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "679f69ef-9960-4e33-a6aa-09baefabc417" (UID: "679f69ef-9960-4e33-a6aa-09baefabc417"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.622866 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec120760-bb10-44ff-bbb0-ed1665b4e17b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec120760-bb10-44ff-bbb0-ed1665b4e17b" (UID: "ec120760-bb10-44ff-bbb0-ed1665b4e17b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.623333 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd7769ae-caf0-4f62-be96-90d6fa334259-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd7769ae-caf0-4f62-be96-90d6fa334259" (UID: "fd7769ae-caf0-4f62-be96-90d6fa334259"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.624287 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1f97f25-d006-40d7-a090-ab45ab11b282-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.624319 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec120760-bb10-44ff-bbb0-ed1665b4e17b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.624333 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd7769ae-caf0-4f62-be96-90d6fa334259-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.624345 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/679f69ef-9960-4e33-a6aa-09baefabc417-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.629972 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/679f69ef-9960-4e33-a6aa-09baefabc417-kube-api-access-sht5z" (OuterVolumeSpecName: "kube-api-access-sht5z") pod "679f69ef-9960-4e33-a6aa-09baefabc417" (UID: "679f69ef-9960-4e33-a6aa-09baefabc417"). InnerVolumeSpecName "kube-api-access-sht5z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.631566 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf0daf0d-c150-49de-98af-3f65dd78112f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf0daf0d-c150-49de-98af-3f65dd78112f" (UID: "bf0daf0d-c150-49de-98af-3f65dd78112f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.631723 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd7769ae-caf0-4f62-be96-90d6fa334259-kube-api-access-5jdc5" (OuterVolumeSpecName: "kube-api-access-5jdc5") pod "fd7769ae-caf0-4f62-be96-90d6fa334259" (UID: "fd7769ae-caf0-4f62-be96-90d6fa334259"). InnerVolumeSpecName "kube-api-access-5jdc5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.632626 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94c35b26-7dc1-4cea-bbe7-53a9e47df7ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94c35b26-7dc1-4cea-bbe7-53a9e47df7ba" (UID: "94c35b26-7dc1-4cea-bbe7-53a9e47df7ba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.633956 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec120760-bb10-44ff-bbb0-ed1665b4e17b-kube-api-access-thfk7" (OuterVolumeSpecName: "kube-api-access-thfk7") pod "ec120760-bb10-44ff-bbb0-ed1665b4e17b" (UID: "ec120760-bb10-44ff-bbb0-ed1665b4e17b"). InnerVolumeSpecName "kube-api-access-thfk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.636224 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f97f25-d006-40d7-a090-ab45ab11b282-kube-api-access-m6wv4" (OuterVolumeSpecName: "kube-api-access-m6wv4") pod "e1f97f25-d006-40d7-a090-ab45ab11b282" (UID: "e1f97f25-d006-40d7-a090-ab45ab11b282"). InnerVolumeSpecName "kube-api-access-m6wv4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.637760 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c35b26-7dc1-4cea-bbe7-53a9e47df7ba-kube-api-access-j4hbs" (OuterVolumeSpecName: "kube-api-access-j4hbs") pod "94c35b26-7dc1-4cea-bbe7-53a9e47df7ba" (UID: "94c35b26-7dc1-4cea-bbe7-53a9e47df7ba"). InnerVolumeSpecName "kube-api-access-j4hbs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.640906 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf0daf0d-c150-49de-98af-3f65dd78112f-kube-api-access-tj5pl" (OuterVolumeSpecName: "kube-api-access-tj5pl") pod "bf0daf0d-c150-49de-98af-3f65dd78112f" (UID: "bf0daf0d-c150-49de-98af-3f65dd78112f"). InnerVolumeSpecName "kube-api-access-tj5pl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.726229 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj5pl\" (UniqueName: \"kubernetes.io/projected/bf0daf0d-c150-49de-98af-3f65dd78112f-kube-api-access-tj5pl\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.726255 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jdc5\" (UniqueName: \"kubernetes.io/projected/fd7769ae-caf0-4f62-be96-90d6fa334259-kube-api-access-5jdc5\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.726266 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0daf0d-c150-49de-98af-3f65dd78112f-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.726274 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sht5z\" (UniqueName: \"kubernetes.io/projected/679f69ef-9960-4e33-a6aa-09baefabc417-kube-api-access-sht5z\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.726283 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thfk7\" (UniqueName: \"kubernetes.io/projected/ec120760-bb10-44ff-bbb0-ed1665b4e17b-kube-api-access-thfk7\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.726291 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94c35b26-7dc1-4cea-bbe7-53a9e47df7ba-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.726300 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4hbs\" (UniqueName: \"kubernetes.io/projected/94c35b26-7dc1-4cea-bbe7-53a9e47df7ba-kube-api-access-j4hbs\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:01.726309 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6wv4\" (UniqueName: \"kubernetes.io/projected/e1f97f25-d006-40d7-a090-ab45ab11b282-kube-api-access-m6wv4\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.075772 4873 generic.go:334] "Generic (PLEG): container finished" podID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerID="5dba258377743a5715353e87db5d11c9c4d59b17ccf94ca70c8d425eaaad3a52" exitCode=0
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.075835 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d1070e0c-7518-4d1b-bbb8-e56db1cad28a","Type":"ContainerDied","Data":"5dba258377743a5715353e87db5d11c9c4d59b17ccf94ca70c8d425eaaad3a52"}
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.078213 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nfk5h"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.078202 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nfk5h" event={"ID":"94c35b26-7dc1-4cea-bbe7-53a9e47df7ba","Type":"ContainerDied","Data":"6fbd3a2c469215faf7d0f0cdd4481319d382cf3f96f79d5bbcd5880004e38acc"}
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.078339 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fbd3a2c469215faf7d0f0cdd4481319d382cf3f96f79d5bbcd5880004e38acc"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.091152 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-45f0-account-create-update-b4rvj"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.091183 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-45f0-account-create-update-b4rvj" event={"ID":"bf0daf0d-c150-49de-98af-3f65dd78112f","Type":"ContainerDied","Data":"eb552ca98ac78dec98734f67b2cfc6dd764f57bca299edb1d904e9f18a03a5a9"}
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.091253 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb552ca98ac78dec98734f67b2cfc6dd764f57bca299edb1d904e9f18a03a5a9"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.092784 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73d0-account-create-update-vwj8q" event={"ID":"fd7769ae-caf0-4f62-be96-90d6fa334259","Type":"ContainerDied","Data":"37e3d42b2018f2672ea117a6adcb9cd16be3ac6b642d4d999186c536331b880a"}
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.092829 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37e3d42b2018f2672ea117a6adcb9cd16be3ac6b642d4d999186c536331b880a"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.092841 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-73d0-account-create-update-vwj8q"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.094157 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f5jnw" event={"ID":"ec120760-bb10-44ff-bbb0-ed1665b4e17b","Type":"ContainerDied","Data":"dca669af7f7e92df448e624a6bcc1f9ab5d919831ea4f5a0d25f76e099c94782"}
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.094177 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dca669af7f7e92df448e624a6bcc1f9ab5d919831ea4f5a0d25f76e099c94782"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.094226 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f5jnw"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.095508 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-w65h5"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.095524 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-w65h5" event={"ID":"e1f97f25-d006-40d7-a090-ab45ab11b282","Type":"ContainerDied","Data":"6ff99127cb67f2960d3ae8f41a09b021685c8e4644ccb65876e46a1f18c275f3"}
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.095602 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ff99127cb67f2960d3ae8f41a09b021685c8e4644ccb65876e46a1f18c275f3"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.097244 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-63c5-account-create-update-rqftk" event={"ID":"679f69ef-9960-4e33-a6aa-09baefabc417","Type":"ContainerDied","Data":"8ba33e912f464ea0722df73f3baa819208f1910932806de73fe2ef0c44ba7498"}
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.097265 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ba33e912f464ea0722df73f3baa819208f1910932806de73fe2ef0c44ba7498"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.097308 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-63c5-account-create-update-rqftk"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.277494 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.438429 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d115a791-c703-4c6e-91e5-8f3ab9608277-catalog-content\") pod \"d115a791-c703-4c6e-91e5-8f3ab9608277\" (UID: \"d115a791-c703-4c6e-91e5-8f3ab9608277\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.438979 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d115a791-c703-4c6e-91e5-8f3ab9608277-utilities\") pod \"d115a791-c703-4c6e-91e5-8f3ab9608277\" (UID: \"d115a791-c703-4c6e-91e5-8f3ab9608277\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.439147 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl4z6\" (UniqueName: \"kubernetes.io/projected/d115a791-c703-4c6e-91e5-8f3ab9608277-kube-api-access-pl4z6\") pod \"d115a791-c703-4c6e-91e5-8f3ab9608277\" (UID: \"d115a791-c703-4c6e-91e5-8f3ab9608277\") "
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.440641 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d115a791-c703-4c6e-91e5-8f3ab9608277-utilities" (OuterVolumeSpecName: "utilities") pod "d115a791-c703-4c6e-91e5-8f3ab9608277" (UID: "d115a791-c703-4c6e-91e5-8f3ab9608277"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.442685 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d115a791-c703-4c6e-91e5-8f3ab9608277-kube-api-access-pl4z6" (OuterVolumeSpecName: "kube-api-access-pl4z6") pod "d115a791-c703-4c6e-91e5-8f3ab9608277" (UID: "d115a791-c703-4c6e-91e5-8f3ab9608277"). InnerVolumeSpecName "kube-api-access-pl4z6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.530294 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4"
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.540781 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d115a791-c703-4c6e-91e5-8f3ab9608277-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.540806 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl4z6\" (UniqueName: \"kubernetes.io/projected/d115a791-c703-4c6e-91e5-8f3ab9608277-kube-api-access-pl4z6\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.547126 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d115a791-c703-4c6e-91e5-8f3ab9608277-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d115a791-c703-4c6e-91e5-8f3ab9608277" (UID: "d115a791-c703-4c6e-91e5-8f3ab9608277"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.645782 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d115a791-c703-4c6e-91e5-8f3ab9608277-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.646942 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f5d85f6c-9z7rp"]
Feb 19 10:02:02 crc kubenswrapper[4873]: I0219 10:02:02.647146 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp" podUID="710b77db-c69e-4428-93f6-7ce8b2c7ee17" containerName="dnsmasq-dns" containerID="cri-o://9b82833ea4c40ca9217967e3dd7ede76510477fd6b291d755ae79e49cbe06c97" gracePeriod=10
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.064404 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.118263 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmkgp" event={"ID":"d115a791-c703-4c6e-91e5-8f3ab9608277","Type":"ContainerDied","Data":"164021a9f61ae1ea4080d3f61899481761a6227bd5066ab4023318b78119f680"}
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.118309 4873 scope.go:117] "RemoveContainer" containerID="a571f36e40c7ad937de6c80c4fe2960d3a56ca51d705e8212a16e2333ee169bb"
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.118427 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmkgp"
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.137820 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp"
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.137836 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp" event={"ID":"710b77db-c69e-4428-93f6-7ce8b2c7ee17","Type":"ContainerDied","Data":"9b82833ea4c40ca9217967e3dd7ede76510477fd6b291d755ae79e49cbe06c97"}
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.137770 4873 generic.go:334] "Generic (PLEG): container finished" podID="710b77db-c69e-4428-93f6-7ce8b2c7ee17" containerID="9b82833ea4c40ca9217967e3dd7ede76510477fd6b291d755ae79e49cbe06c97" exitCode=0
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.138006 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f5d85f6c-9z7rp" event={"ID":"710b77db-c69e-4428-93f6-7ce8b2c7ee17","Type":"ContainerDied","Data":"d5213d8f776a516eb0ebc1bff77eadf707410bc2d3c6d133cd538660a60a385d"}
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.148735 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-86n9s" event={"ID":"735c003d-082d-431f-9906-20c8946f1bf4","Type":"ContainerStarted","Data":"d5e694f492487d42cde9591670ae968a23c3523d631e5fc12b8c943bcbf3ca29"}
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.149794 4873 scope.go:117] "RemoveContainer" containerID="ddf2f2a59be91b05f4a24a87c32978d8341502e53c5702e5bb9180a41d34ff6f"
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.154238 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrcpc\" (UniqueName: \"kubernetes.io/projected/710b77db-c69e-4428-93f6-7ce8b2c7ee17-kube-api-access-qrcpc\") pod \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") "
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.154296 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-ovsdbserver-nb\") pod \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") "
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.154318 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-dns-svc\") pod \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") "
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.154341 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-config\") pod \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") "
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.154418 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-ovsdbserver-sb\") pod \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\" (UID: \"710b77db-c69e-4428-93f6-7ce8b2c7ee17\") "
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.156661 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d1070e0c-7518-4d1b-bbb8-e56db1cad28a","Type":"ContainerStarted","Data":"527fb1a245b8b03bec5d82faa5a774b17399f65786d8762162ef556987d4ff0f"}
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.158322 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-k6j2h" event={"ID":"a075072a-1153-4963-91c7-e9e2aa08f988","Type":"ContainerStarted","Data":"f7fcab32e5de37523d8bdcbeaad1ae0eeef4c93525ac32d44d6c45730e393e7a"}
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.164983 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lmkgp"]
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.177498 4873 scope.go:117] "RemoveContainer" containerID="c3c32c24ff9ddc9c878bf60c4e06dc7e24a6feab7886836d8ecf2510f7a2f602"
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.179209 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lmkgp"]
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.182331 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/710b77db-c69e-4428-93f6-7ce8b2c7ee17-kube-api-access-qrcpc" (OuterVolumeSpecName: "kube-api-access-qrcpc") pod "710b77db-c69e-4428-93f6-7ce8b2c7ee17" (UID: "710b77db-c69e-4428-93f6-7ce8b2c7ee17"). InnerVolumeSpecName "kube-api-access-qrcpc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.184530 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-86n9s" podStartSLOduration=8.581058443 podStartE2EDuration="18.184498705s" podCreationTimestamp="2026-02-19 10:01:45 +0000 UTC" firstStartedPulling="2026-02-19 10:01:52.633198018 +0000 UTC m=+1021.922629666" lastFinishedPulling="2026-02-19 10:02:02.23663829 +0000 UTC m=+1031.526069928" observedRunningTime="2026-02-19 10:02:03.169523104 +0000 UTC m=+1032.458954742" watchObservedRunningTime="2026-02-19 10:02:03.184498705 +0000 UTC m=+1032.473930343"
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.191885 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-k6j2h" podStartSLOduration=7.536493367 podStartE2EDuration="17.191868698s" podCreationTimestamp="2026-02-19 10:01:46 +0000 UTC" firstStartedPulling="2026-02-19 10:01:52.643208136 +0000 UTC m=+1021.932639764" lastFinishedPulling="2026-02-19 10:02:02.298583457 +0000 UTC m=+1031.588015095" observedRunningTime="2026-02-19 10:02:03.187945961 +0000 UTC m=+1032.477377599" watchObservedRunningTime="2026-02-19 10:02:03.191868698 +0000 UTC m=+1032.481300336"
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.200842 4873 scope.go:117] "RemoveContainer" containerID="9b82833ea4c40ca9217967e3dd7ede76510477fd6b291d755ae79e49cbe06c97"
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.211955 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "710b77db-c69e-4428-93f6-7ce8b2c7ee17" (UID: "710b77db-c69e-4428-93f6-7ce8b2c7ee17"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.216341 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "710b77db-c69e-4428-93f6-7ce8b2c7ee17" (UID: "710b77db-c69e-4428-93f6-7ce8b2c7ee17"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.223331 4873 scope.go:117] "RemoveContainer" containerID="84d6805fa8e8d8d98a61caa72fabcd292a85d006dfab20b6192c2c1765a01aa4"
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.226064 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "710b77db-c69e-4428-93f6-7ce8b2c7ee17" (UID: "710b77db-c69e-4428-93f6-7ce8b2c7ee17"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.237715 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-config" (OuterVolumeSpecName: "config") pod "710b77db-c69e-4428-93f6-7ce8b2c7ee17" (UID: "710b77db-c69e-4428-93f6-7ce8b2c7ee17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.245872 4873 scope.go:117] "RemoveContainer" containerID="9b82833ea4c40ca9217967e3dd7ede76510477fd6b291d755ae79e49cbe06c97"
Feb 19 10:02:03 crc kubenswrapper[4873]: E0219 10:02:03.246495 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b82833ea4c40ca9217967e3dd7ede76510477fd6b291d755ae79e49cbe06c97\": container with ID starting with 9b82833ea4c40ca9217967e3dd7ede76510477fd6b291d755ae79e49cbe06c97 not found: ID does not exist" containerID="9b82833ea4c40ca9217967e3dd7ede76510477fd6b291d755ae79e49cbe06c97"
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.246584 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b82833ea4c40ca9217967e3dd7ede76510477fd6b291d755ae79e49cbe06c97"} err="failed to get container status \"9b82833ea4c40ca9217967e3dd7ede76510477fd6b291d755ae79e49cbe06c97\": rpc error: code = NotFound desc = could not find container \"9b82833ea4c40ca9217967e3dd7ede76510477fd6b291d755ae79e49cbe06c97\": container with ID starting with 9b82833ea4c40ca9217967e3dd7ede76510477fd6b291d755ae79e49cbe06c97 not found: ID does not exist"
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.246618 4873 scope.go:117] "RemoveContainer" containerID="84d6805fa8e8d8d98a61caa72fabcd292a85d006dfab20b6192c2c1765a01aa4"
Feb 19 10:02:03 crc kubenswrapper[4873]: E0219 10:02:03.246926 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84d6805fa8e8d8d98a61caa72fabcd292a85d006dfab20b6192c2c1765a01aa4\": container with ID starting with 84d6805fa8e8d8d98a61caa72fabcd292a85d006dfab20b6192c2c1765a01aa4 not found: ID does not exist" containerID="84d6805fa8e8d8d98a61caa72fabcd292a85d006dfab20b6192c2c1765a01aa4"
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.246962 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d6805fa8e8d8d98a61caa72fabcd292a85d006dfab20b6192c2c1765a01aa4"} err="failed to get container status \"84d6805fa8e8d8d98a61caa72fabcd292a85d006dfab20b6192c2c1765a01aa4\": rpc error: code = NotFound desc = could not find container \"84d6805fa8e8d8d98a61caa72fabcd292a85d006dfab20b6192c2c1765a01aa4\": container with ID starting with 84d6805fa8e8d8d98a61caa72fabcd292a85d006dfab20b6192c2c1765a01aa4 not found: ID does not exist"
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.256528 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrcpc\" (UniqueName: \"kubernetes.io/projected/710b77db-c69e-4428-93f6-7ce8b2c7ee17-kube-api-access-qrcpc\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.256558 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.256569 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.256578 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-config\") on node \"crc\" DevicePath
\"\"" Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.256588 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/710b77db-c69e-4428-93f6-7ce8b2c7ee17-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.472329 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f5d85f6c-9z7rp"] Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.482203 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f5d85f6c-9z7rp"] Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.514213 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="710b77db-c69e-4428-93f6-7ce8b2c7ee17" path="/var/lib/kubelet/pods/710b77db-c69e-4428-93f6-7ce8b2c7ee17/volumes" Feb 19 10:02:03 crc kubenswrapper[4873]: I0219 10:02:03.514789 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d115a791-c703-4c6e-91e5-8f3ab9608277" path="/var/lib/kubelet/pods/d115a791-c703-4c6e-91e5-8f3ab9608277/volumes" Feb 19 10:02:06 crc kubenswrapper[4873]: I0219 10:02:06.188178 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d1070e0c-7518-4d1b-bbb8-e56db1cad28a","Type":"ContainerStarted","Data":"e807a1b081c19c6059b7e493eef6d50be38743a8a7b0942d44ce78b41a711573"} Feb 19 10:02:06 crc kubenswrapper[4873]: I0219 10:02:06.188505 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d1070e0c-7518-4d1b-bbb8-e56db1cad28a","Type":"ContainerStarted","Data":"92436e82b0191e1ee2ed056fcf87daa473f88f898d89c02a039d222108f86980"} Feb 19 10:02:06 crc kubenswrapper[4873]: I0219 10:02:06.216711 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=25.216693141 podStartE2EDuration="25.216693141s" 
podCreationTimestamp="2026-02-19 10:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:06.214327432 +0000 UTC m=+1035.503759080" watchObservedRunningTime="2026-02-19 10:02:06.216693141 +0000 UTC m=+1035.506124789" Feb 19 10:02:07 crc kubenswrapper[4873]: I0219 10:02:07.199075 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9472r" event={"ID":"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba","Type":"ContainerStarted","Data":"e5eec3e87329724888651bc35b53713711df95ef48801142ac4dd2488284d91d"} Feb 19 10:02:07 crc kubenswrapper[4873]: I0219 10:02:07.201990 4873 generic.go:334] "Generic (PLEG): container finished" podID="a075072a-1153-4963-91c7-e9e2aa08f988" containerID="f7fcab32e5de37523d8bdcbeaad1ae0eeef4c93525ac32d44d6c45730e393e7a" exitCode=0 Feb 19 10:02:07 crc kubenswrapper[4873]: I0219 10:02:07.202097 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-k6j2h" event={"ID":"a075072a-1153-4963-91c7-e9e2aa08f988","Type":"ContainerDied","Data":"f7fcab32e5de37523d8bdcbeaad1ae0eeef4c93525ac32d44d6c45730e393e7a"} Feb 19 10:02:07 crc kubenswrapper[4873]: I0219 10:02:07.226045 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-9472r" podStartSLOduration=3.616256544 podStartE2EDuration="35.226024212s" podCreationTimestamp="2026-02-19 10:01:32 +0000 UTC" firstStartedPulling="2026-02-19 10:01:33.970057364 +0000 UTC m=+1003.259489002" lastFinishedPulling="2026-02-19 10:02:05.579825032 +0000 UTC m=+1034.869256670" observedRunningTime="2026-02-19 10:02:07.21668883 +0000 UTC m=+1036.506120478" watchObservedRunningTime="2026-02-19 10:02:07.226024212 +0000 UTC m=+1036.515455860" Feb 19 10:02:07 crc kubenswrapper[4873]: I0219 10:02:07.494799 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 19 10:02:08 
crc kubenswrapper[4873]: I0219 10:02:08.211760 4873 generic.go:334] "Generic (PLEG): container finished" podID="735c003d-082d-431f-9906-20c8946f1bf4" containerID="d5e694f492487d42cde9591670ae968a23c3523d631e5fc12b8c943bcbf3ca29" exitCode=0 Feb 19 10:02:08 crc kubenswrapper[4873]: I0219 10:02:08.212311 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-86n9s" event={"ID":"735c003d-082d-431f-9906-20c8946f1bf4","Type":"ContainerDied","Data":"d5e694f492487d42cde9591670ae968a23c3523d631e5fc12b8c943bcbf3ca29"} Feb 19 10:02:08 crc kubenswrapper[4873]: I0219 10:02:08.478918 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-k6j2h" Feb 19 10:02:08 crc kubenswrapper[4873]: I0219 10:02:08.550771 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-combined-ca-bundle\") pod \"a075072a-1153-4963-91c7-e9e2aa08f988\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " Feb 19 10:02:08 crc kubenswrapper[4873]: I0219 10:02:08.550866 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t66wv\" (UniqueName: \"kubernetes.io/projected/a075072a-1153-4963-91c7-e9e2aa08f988-kube-api-access-t66wv\") pod \"a075072a-1153-4963-91c7-e9e2aa08f988\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " Feb 19 10:02:08 crc kubenswrapper[4873]: I0219 10:02:08.550902 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-db-sync-config-data\") pod \"a075072a-1153-4963-91c7-e9e2aa08f988\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " Feb 19 10:02:08 crc kubenswrapper[4873]: I0219 10:02:08.550981 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-config-data\") pod \"a075072a-1153-4963-91c7-e9e2aa08f988\" (UID: \"a075072a-1153-4963-91c7-e9e2aa08f988\") " Feb 19 10:02:08 crc kubenswrapper[4873]: I0219 10:02:08.563767 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a075072a-1153-4963-91c7-e9e2aa08f988" (UID: "a075072a-1153-4963-91c7-e9e2aa08f988"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:08 crc kubenswrapper[4873]: I0219 10:02:08.563767 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a075072a-1153-4963-91c7-e9e2aa08f988-kube-api-access-t66wv" (OuterVolumeSpecName: "kube-api-access-t66wv") pod "a075072a-1153-4963-91c7-e9e2aa08f988" (UID: "a075072a-1153-4963-91c7-e9e2aa08f988"). InnerVolumeSpecName "kube-api-access-t66wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:08 crc kubenswrapper[4873]: I0219 10:02:08.589028 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a075072a-1153-4963-91c7-e9e2aa08f988" (UID: "a075072a-1153-4963-91c7-e9e2aa08f988"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:08 crc kubenswrapper[4873]: I0219 10:02:08.597816 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-config-data" (OuterVolumeSpecName: "config-data") pod "a075072a-1153-4963-91c7-e9e2aa08f988" (UID: "a075072a-1153-4963-91c7-e9e2aa08f988"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:08 crc kubenswrapper[4873]: I0219 10:02:08.654661 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t66wv\" (UniqueName: \"kubernetes.io/projected/a075072a-1153-4963-91c7-e9e2aa08f988-kube-api-access-t66wv\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:08 crc kubenswrapper[4873]: I0219 10:02:08.654695 4873 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:08 crc kubenswrapper[4873]: I0219 10:02:08.654705 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:08 crc kubenswrapper[4873]: I0219 10:02:08.654713 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a075072a-1153-4963-91c7-e9e2aa08f988-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:09 crc kubenswrapper[4873]: I0219 10:02:09.223409 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-k6j2h" event={"ID":"a075072a-1153-4963-91c7-e9e2aa08f988","Type":"ContainerDied","Data":"5743e2884f1a9a199fc04d17b81aca13969dd3eeccc42a1ad3851609621f80db"} Feb 19 10:02:09 crc kubenswrapper[4873]: I0219 10:02:09.223455 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5743e2884f1a9a199fc04d17b81aca13969dd3eeccc42a1ad3851609621f80db" Feb 19 10:02:09 crc kubenswrapper[4873]: I0219 10:02:09.224257 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-k6j2h" Feb 19 10:02:09 crc kubenswrapper[4873]: I0219 10:02:09.547161 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-86n9s" Feb 19 10:02:09 crc kubenswrapper[4873]: I0219 10:02:09.671804 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735c003d-082d-431f-9906-20c8946f1bf4-combined-ca-bundle\") pod \"735c003d-082d-431f-9906-20c8946f1bf4\" (UID: \"735c003d-082d-431f-9906-20c8946f1bf4\") " Feb 19 10:02:09 crc kubenswrapper[4873]: I0219 10:02:09.671875 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4qhl\" (UniqueName: \"kubernetes.io/projected/735c003d-082d-431f-9906-20c8946f1bf4-kube-api-access-p4qhl\") pod \"735c003d-082d-431f-9906-20c8946f1bf4\" (UID: \"735c003d-082d-431f-9906-20c8946f1bf4\") " Feb 19 10:02:09 crc kubenswrapper[4873]: I0219 10:02:09.671917 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735c003d-082d-431f-9906-20c8946f1bf4-config-data\") pod \"735c003d-082d-431f-9906-20c8946f1bf4\" (UID: \"735c003d-082d-431f-9906-20c8946f1bf4\") " Feb 19 10:02:09 crc kubenswrapper[4873]: I0219 10:02:09.693373 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/735c003d-082d-431f-9906-20c8946f1bf4-kube-api-access-p4qhl" (OuterVolumeSpecName: "kube-api-access-p4qhl") pod "735c003d-082d-431f-9906-20c8946f1bf4" (UID: "735c003d-082d-431f-9906-20c8946f1bf4"). InnerVolumeSpecName "kube-api-access-p4qhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:09 crc kubenswrapper[4873]: I0219 10:02:09.698367 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/735c003d-082d-431f-9906-20c8946f1bf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "735c003d-082d-431f-9906-20c8946f1bf4" (UID: "735c003d-082d-431f-9906-20c8946f1bf4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:09 crc kubenswrapper[4873]: I0219 10:02:09.721563 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/735c003d-082d-431f-9906-20c8946f1bf4-config-data" (OuterVolumeSpecName: "config-data") pod "735c003d-082d-431f-9906-20c8946f1bf4" (UID: "735c003d-082d-431f-9906-20c8946f1bf4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:09 crc kubenswrapper[4873]: I0219 10:02:09.773891 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735c003d-082d-431f-9906-20c8946f1bf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:09 crc kubenswrapper[4873]: I0219 10:02:09.773940 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4qhl\" (UniqueName: \"kubernetes.io/projected/735c003d-082d-431f-9906-20c8946f1bf4-kube-api-access-p4qhl\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:09 crc kubenswrapper[4873]: I0219 10:02:09.773954 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735c003d-082d-431f-9906-20c8946f1bf4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.233195 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-86n9s" event={"ID":"735c003d-082d-431f-9906-20c8946f1bf4","Type":"ContainerDied","Data":"e7a7027c2775e72cf8a30257c067cda81d18ed0d292fcd29a26190fc119c919e"} Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.233233 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-86n9s" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.233235 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7a7027c2775e72cf8a30257c067cda81d18ed0d292fcd29a26190fc119c919e" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.500509 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c85f9c585-fhvwm"] Feb 19 10:02:10 crc kubenswrapper[4873]: E0219 10:02:10.500871 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a075072a-1153-4963-91c7-e9e2aa08f988" containerName="watcher-db-sync" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.500888 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a075072a-1153-4963-91c7-e9e2aa08f988" containerName="watcher-db-sync" Feb 19 10:02:10 crc kubenswrapper[4873]: E0219 10:02:10.500906 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f97f25-d006-40d7-a090-ab45ab11b282" containerName="mariadb-database-create" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.500913 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f97f25-d006-40d7-a090-ab45ab11b282" containerName="mariadb-database-create" Feb 19 10:02:10 crc kubenswrapper[4873]: E0219 10:02:10.500921 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="710b77db-c69e-4428-93f6-7ce8b2c7ee17" containerName="init" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.500927 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="710b77db-c69e-4428-93f6-7ce8b2c7ee17" containerName="init" Feb 19 10:02:10 crc kubenswrapper[4873]: E0219 10:02:10.500938 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="710b77db-c69e-4428-93f6-7ce8b2c7ee17" containerName="dnsmasq-dns" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.500944 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="710b77db-c69e-4428-93f6-7ce8b2c7ee17" 
containerName="dnsmasq-dns" Feb 19 10:02:10 crc kubenswrapper[4873]: E0219 10:02:10.500952 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="735c003d-082d-431f-9906-20c8946f1bf4" containerName="keystone-db-sync" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.500957 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="735c003d-082d-431f-9906-20c8946f1bf4" containerName="keystone-db-sync" Feb 19 10:02:10 crc kubenswrapper[4873]: E0219 10:02:10.500965 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="679f69ef-9960-4e33-a6aa-09baefabc417" containerName="mariadb-account-create-update" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.500971 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="679f69ef-9960-4e33-a6aa-09baefabc417" containerName="mariadb-account-create-update" Feb 19 10:02:10 crc kubenswrapper[4873]: E0219 10:02:10.500980 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec120760-bb10-44ff-bbb0-ed1665b4e17b" containerName="mariadb-database-create" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.500987 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec120760-bb10-44ff-bbb0-ed1665b4e17b" containerName="mariadb-database-create" Feb 19 10:02:10 crc kubenswrapper[4873]: E0219 10:02:10.501002 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerName="registry-server" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501009 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerName="registry-server" Feb 19 10:02:10 crc kubenswrapper[4873]: E0219 10:02:10.501017 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerName="extract-content" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501023 4873 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerName="extract-content" Feb 19 10:02:10 crc kubenswrapper[4873]: E0219 10:02:10.501030 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c35b26-7dc1-4cea-bbe7-53a9e47df7ba" containerName="mariadb-database-create" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501036 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c35b26-7dc1-4cea-bbe7-53a9e47df7ba" containerName="mariadb-database-create" Feb 19 10:02:10 crc kubenswrapper[4873]: E0219 10:02:10.501050 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerName="extract-utilities" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501056 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerName="extract-utilities" Feb 19 10:02:10 crc kubenswrapper[4873]: E0219 10:02:10.501067 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf0daf0d-c150-49de-98af-3f65dd78112f" containerName="mariadb-account-create-update" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501073 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0daf0d-c150-49de-98af-3f65dd78112f" containerName="mariadb-account-create-update" Feb 19 10:02:10 crc kubenswrapper[4873]: E0219 10:02:10.501083 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd7769ae-caf0-4f62-be96-90d6fa334259" containerName="mariadb-account-create-update" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501089 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd7769ae-caf0-4f62-be96-90d6fa334259" containerName="mariadb-account-create-update" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501639 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf0daf0d-c150-49de-98af-3f65dd78112f" containerName="mariadb-account-create-update" Feb 19 10:02:10 crc kubenswrapper[4873]: 
I0219 10:02:10.501667 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="d115a791-c703-4c6e-91e5-8f3ab9608277" containerName="registry-server" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501684 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c35b26-7dc1-4cea-bbe7-53a9e47df7ba" containerName="mariadb-database-create" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501708 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f97f25-d006-40d7-a090-ab45ab11b282" containerName="mariadb-database-create" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501722 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="679f69ef-9960-4e33-a6aa-09baefabc417" containerName="mariadb-account-create-update" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501739 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd7769ae-caf0-4f62-be96-90d6fa334259" containerName="mariadb-account-create-update" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501754 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="735c003d-082d-431f-9906-20c8946f1bf4" containerName="keystone-db-sync" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501772 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a075072a-1153-4963-91c7-e9e2aa08f988" containerName="watcher-db-sync" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501785 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec120760-bb10-44ff-bbb0-ed1665b4e17b" containerName="mariadb-database-create" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.501799 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="710b77db-c69e-4428-93f6-7ce8b2c7ee17" containerName="dnsmasq-dns" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.502740 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.522085 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c85f9c585-fhvwm"] Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.553790 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-nf742"] Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.554981 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.563282 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.563962 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-27d74" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.566583 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.566620 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.568146 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nf742"] Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.568501 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.596377 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-config\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.596477 
4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-dns-svc\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.596556 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tm6v\" (UniqueName: \"kubernetes.io/projected/90450452-854d-4886-b5ce-828f85c3f721-kube-api-access-2tm6v\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.596592 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-ovsdbserver-sb\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.596638 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-dns-swift-storage-0\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.596664 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-ovsdbserver-nb\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 
10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.658036 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.659213 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.670159 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.674539 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-m755p" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.680174 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.697976 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-dns-swift-storage-0\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.698019 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-ovsdbserver-nb\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.698051 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-credential-keys\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " 
pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.698091 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-config\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.698126 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-fernet-keys\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.698163 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-dns-svc\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.698182 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-config-data\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.698197 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-combined-ca-bundle\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc 
kubenswrapper[4873]: I0219 10:02:10.698224 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-scripts\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.698256 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tm6v\" (UniqueName: \"kubernetes.io/projected/90450452-854d-4886-b5ce-828f85c3f721-kube-api-access-2tm6v\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.698280 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-ovsdbserver-sb\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.698300 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sphnf\" (UniqueName: \"kubernetes.io/projected/f631ba50-5961-428e-83a5-a8ddb50085d3-kube-api-access-sphnf\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.700059 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-dns-swift-storage-0\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc 
kubenswrapper[4873]: I0219 10:02:10.700456 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.701030 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-ovsdbserver-sb\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.701041 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-dns-svc\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.701484 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-ovsdbserver-nb\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.701545 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-config\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.702615 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.703843 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.704657 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.705714 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.708682 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.748156 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tm6v\" (UniqueName: \"kubernetes.io/projected/90450452-854d-4886-b5ce-828f85c3f721-kube-api-access-2tm6v\") pod \"dnsmasq-dns-5c85f9c585-fhvwm\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.754168 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.790599 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802179 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802227 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4b9v6\" (UniqueName: \"kubernetes.io/projected/95402218-fbbb-4453-aba6-d135ba3a26bd-kube-api-access-4b9v6\") pod \"watcher-applier-0\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " pod="openstack/watcher-applier-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802251 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95402218-fbbb-4453-aba6-d135ba3a26bd-logs\") pod \"watcher-applier-0\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " pod="openstack/watcher-applier-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802272 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-config-data\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802296 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-combined-ca-bundle\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802327 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802397 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-scripts\") pod 
\"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802424 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf92r\" (UniqueName: \"kubernetes.io/projected/b8008736-31ec-491c-aa52-03b9413feab9-kube-api-access-cf92r\") pod \"watcher-decision-engine-0\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802465 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802510 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8008736-31ec-491c-aa52-03b9413feab9-logs\") pod \"watcher-decision-engine-0\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802553 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bxrn\" (UniqueName: \"kubernetes.io/projected/a575d51d-1ad3-422e-8e7c-b24b2c5de526-kube-api-access-9bxrn\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802581 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95402218-fbbb-4453-aba6-d135ba3a26bd-config-data\") pod 
\"watcher-applier-0\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " pod="openstack/watcher-applier-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802600 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sphnf\" (UniqueName: \"kubernetes.io/projected/f631ba50-5961-428e-83a5-a8ddb50085d3-kube-api-access-sphnf\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802654 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802707 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-credential-keys\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802762 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a575d51d-1ad3-422e-8e7c-b24b2c5de526-logs\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802808 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-fernet-keys\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " 
pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802826 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-config-data\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802855 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.802870 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95402218-fbbb-4453-aba6-d135ba3a26bd-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " pod="openstack/watcher-applier-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.806815 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-config-data\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.821613 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-credential-keys\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 
10:02:10.821699 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-combined-ca-bundle\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.822945 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-fernet-keys\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.824509 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.827472 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-98c8c74bf-wsl5f"] Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.838978 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.839613 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sphnf\" (UniqueName: \"kubernetes.io/projected/f631ba50-5961-428e-83a5-a8ddb50085d3-kube-api-access-sphnf\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.843281 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-scripts\") pod \"keystone-bootstrap-nf742\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.843897 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.844634 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.844841 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-l5tm9" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.845303 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.899012 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-98c8c74bf-wsl5f"] Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.913718 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c639af02-a4c8-40cf-947e-a50353ab2537-horizon-secret-key\") pod \"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " 
pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.913793 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.913820 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf92r\" (UniqueName: \"kubernetes.io/projected/b8008736-31ec-491c-aa52-03b9413feab9-kube-api-access-cf92r\") pod \"watcher-decision-engine-0\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.913849 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.913885 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8008736-31ec-491c-aa52-03b9413feab9-logs\") pod \"watcher-decision-engine-0\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.913930 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bxrn\" (UniqueName: \"kubernetes.io/projected/a575d51d-1ad3-422e-8e7c-b24b2c5de526-kube-api-access-9bxrn\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.913955 4873 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95402218-fbbb-4453-aba6-d135ba3a26bd-config-data\") pod \"watcher-applier-0\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " pod="openstack/watcher-applier-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.913991 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.914028 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c639af02-a4c8-40cf-947e-a50353ab2537-scripts\") pod \"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.914064 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a575d51d-1ad3-422e-8e7c-b24b2c5de526-logs\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.914082 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c639af02-a4c8-40cf-947e-a50353ab2537-logs\") pod \"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.914218 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-config-data\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.914238 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdfl7\" (UniqueName: \"kubernetes.io/projected/c639af02-a4c8-40cf-947e-a50353ab2537-kube-api-access-pdfl7\") pod \"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.914264 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.914280 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95402218-fbbb-4453-aba6-d135ba3a26bd-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " pod="openstack/watcher-applier-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.914313 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.914332 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c639af02-a4c8-40cf-947e-a50353ab2537-config-data\") pod 
\"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.914358 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b9v6\" (UniqueName: \"kubernetes.io/projected/95402218-fbbb-4453-aba6-d135ba3a26bd-kube-api-access-4b9v6\") pod \"watcher-applier-0\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " pod="openstack/watcher-applier-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.914373 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95402218-fbbb-4453-aba6-d135ba3a26bd-logs\") pod \"watcher-applier-0\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " pod="openstack/watcher-applier-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.914531 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.914762 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95402218-fbbb-4453-aba6-d135ba3a26bd-logs\") pod \"watcher-applier-0\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " pod="openstack/watcher-applier-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.927767 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a575d51d-1ad3-422e-8e7c-b24b2c5de526-logs\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.941545 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8008736-31ec-491c-aa52-03b9413feab9-logs\") pod \"watcher-decision-engine-0\" (UID: 
\"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.942000 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.944522 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95402218-fbbb-4453-aba6-d135ba3a26bd-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " pod="openstack/watcher-applier-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.949933 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.950474 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95402218-fbbb-4453-aba6-d135ba3a26bd-config-data\") pod \"watcher-applier-0\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " pod="openstack/watcher-applier-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.953125 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-config-data\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.969231 4873 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.969697 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.969733 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-gqrb5"] Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.970814 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gqrb5" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.974853 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.975079 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tmcc9" Feb 19 10:02:10 crc kubenswrapper[4873]: I0219 10:02:10.978477 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:10.997299 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gqrb5"] Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:10.997630 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b9v6\" (UniqueName: \"kubernetes.io/projected/95402218-fbbb-4453-aba6-d135ba3a26bd-kube-api-access-4b9v6\") pod \"watcher-applier-0\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " 
pod="openstack/watcher-applier-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.010527 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.010799 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf92r\" (UniqueName: \"kubernetes.io/projected/b8008736-31ec-491c-aa52-03b9413feab9-kube-api-access-cf92r\") pod \"watcher-decision-engine-0\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.015845 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bxrn\" (UniqueName: \"kubernetes.io/projected/a575d51d-1ad3-422e-8e7c-b24b2c5de526-kube-api-access-9bxrn\") pod \"watcher-api-0\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " pod="openstack/watcher-api-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.016845 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c639af02-a4c8-40cf-947e-a50353ab2537-scripts\") pod \"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.016885 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c639af02-a4c8-40cf-947e-a50353ab2537-logs\") pod \"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.016913 4873 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-pdfl7\" (UniqueName: \"kubernetes.io/projected/c639af02-a4c8-40cf-947e-a50353ab2537-kube-api-access-pdfl7\") pod \"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.016949 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c639af02-a4c8-40cf-947e-a50353ab2537-config-data\") pod \"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.016974 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c639af02-a4c8-40cf-947e-a50353ab2537-horizon-secret-key\") pod \"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.025427 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c639af02-a4c8-40cf-947e-a50353ab2537-logs\") pod \"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.026034 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c639af02-a4c8-40cf-947e-a50353ab2537-scripts\") pod \"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.026786 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c639af02-a4c8-40cf-947e-a50353ab2537-config-data\") pod 
\"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " pod="openstack/horizon-98c8c74bf-wsl5f"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.030012 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vf762"]
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.030457 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c639af02-a4c8-40cf-947e-a50353ab2537-horizon-secret-key\") pod \"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " pod="openstack/horizon-98c8c74bf-wsl5f"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.031470 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vf762"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.039984 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.053039 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.054055 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pk4jm"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.054335 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.054900 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.082544 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdfl7\" (UniqueName: \"kubernetes.io/projected/c639af02-a4c8-40cf-947e-a50353ab2537-kube-api-access-pdfl7\") pod \"horizon-98c8c74bf-wsl5f\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " pod="openstack/horizon-98c8c74bf-wsl5f"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.095418 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vf762"]
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.118274 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-db-sync-config-data\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.118597 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99868e3f-82d7-4f0c-9056-661e95486e6e-combined-ca-bundle\") pod \"neutron-db-sync-vf762\" (UID: \"99868e3f-82d7-4f0c-9056-661e95486e6e\") " pod="openstack/neutron-db-sync-vf762"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.118632 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-scripts\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.118667 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-combined-ca-bundle\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.118777 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99868e3f-82d7-4f0c-9056-661e95486e6e-config\") pod \"neutron-db-sync-vf762\" (UID: \"99868e3f-82d7-4f0c-9056-661e95486e6e\") " pod="openstack/neutron-db-sync-vf762"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.118805 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48k6w\" (UniqueName: \"kubernetes.io/projected/99868e3f-82d7-4f0c-9056-661e95486e6e-kube-api-access-48k6w\") pod \"neutron-db-sync-vf762\" (UID: \"99868e3f-82d7-4f0c-9056-661e95486e6e\") " pod="openstack/neutron-db-sync-vf762"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.118838 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce5accb4-1da0-4a21-a289-7dba33ad935f-etc-machine-id\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.118860 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-config-data\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.118906 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7ttv\" (UniqueName: \"kubernetes.io/projected/ce5accb4-1da0-4a21-a289-7dba33ad935f-kube-api-access-k7ttv\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.133266 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-4pv5z"]
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.134368 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4pv5z"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.141146 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-4pv5z"]
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.141472 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.141979 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-t72rv"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.149192 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c85f9c585-fhvwm"]
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.181382 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-99d6b5b4f-2j7fk"]
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.182912 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.220406 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-98c8c74bf-wsl5f"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.222602 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-db-sync-config-data\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.222646 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99868e3f-82d7-4f0c-9056-661e95486e6e-combined-ca-bundle\") pod \"neutron-db-sync-vf762\" (UID: \"99868e3f-82d7-4f0c-9056-661e95486e6e\") " pod="openstack/neutron-db-sync-vf762"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.222683 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-scripts\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.222714 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d069e-6ad4-4411-b937-c4499f0ced6f-combined-ca-bundle\") pod \"barbican-db-sync-4pv5z\" (UID: \"943d069e-6ad4-4411-b937-c4499f0ced6f\") " pod="openstack/barbican-db-sync-4pv5z"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.222754 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-combined-ca-bundle\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.222814 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/943d069e-6ad4-4411-b937-c4499f0ced6f-db-sync-config-data\") pod \"barbican-db-sync-4pv5z\" (UID: \"943d069e-6ad4-4411-b937-c4499f0ced6f\") " pod="openstack/barbican-db-sync-4pv5z"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.222931 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78b5v\" (UniqueName: \"kubernetes.io/projected/943d069e-6ad4-4411-b937-c4499f0ced6f-kube-api-access-78b5v\") pod \"barbican-db-sync-4pv5z\" (UID: \"943d069e-6ad4-4411-b937-c4499f0ced6f\") " pod="openstack/barbican-db-sync-4pv5z"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.222961 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99868e3f-82d7-4f0c-9056-661e95486e6e-config\") pod \"neutron-db-sync-vf762\" (UID: \"99868e3f-82d7-4f0c-9056-661e95486e6e\") " pod="openstack/neutron-db-sync-vf762"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.223011 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48k6w\" (UniqueName: \"kubernetes.io/projected/99868e3f-82d7-4f0c-9056-661e95486e6e-kube-api-access-48k6w\") pod \"neutron-db-sync-vf762\" (UID: \"99868e3f-82d7-4f0c-9056-661e95486e6e\") " pod="openstack/neutron-db-sync-vf762"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.223046 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce5accb4-1da0-4a21-a289-7dba33ad935f-etc-machine-id\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.223067 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-config-data\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.223486 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7ttv\" (UniqueName: \"kubernetes.io/projected/ce5accb4-1da0-4a21-a289-7dba33ad935f-kube-api-access-k7ttv\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.244662 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce5accb4-1da0-4a21-a289-7dba33ad935f-etc-machine-id\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.256424 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99868e3f-82d7-4f0c-9056-661e95486e6e-combined-ca-bundle\") pod \"neutron-db-sync-vf762\" (UID: \"99868e3f-82d7-4f0c-9056-661e95486e6e\") " pod="openstack/neutron-db-sync-vf762"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.256898 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-db-sync-config-data\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.262318 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7ttv\" (UniqueName: \"kubernetes.io/projected/ce5accb4-1da0-4a21-a289-7dba33ad935f-kube-api-access-k7ttv\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.264732 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-scripts\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.265660 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48k6w\" (UniqueName: \"kubernetes.io/projected/99868e3f-82d7-4f0c-9056-661e95486e6e-kube-api-access-48k6w\") pod \"neutron-db-sync-vf762\" (UID: \"99868e3f-82d7-4f0c-9056-661e95486e6e\") " pod="openstack/neutron-db-sync-vf762"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.269919 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/99868e3f-82d7-4f0c-9056-661e95486e6e-config\") pod \"neutron-db-sync-vf762\" (UID: \"99868e3f-82d7-4f0c-9056-661e95486e6e\") " pod="openstack/neutron-db-sync-vf762"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.278902 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vf762"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.290205 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-combined-ca-bundle\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.290836 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.302294 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-config-data\") pod \"cinder-db-sync-gqrb5\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") " pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.335501 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79f476f4fc-dsgbh"]
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.357498 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.407176 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/943d069e-6ad4-4411-b937-c4499f0ced6f-db-sync-config-data\") pod \"barbican-db-sync-4pv5z\" (UID: \"943d069e-6ad4-4411-b937-c4499f0ced6f\") " pod="openstack/barbican-db-sync-4pv5z"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.407244 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-ovsdbserver-nb\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.407366 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78b5v\" (UniqueName: \"kubernetes.io/projected/943d069e-6ad4-4411-b937-c4499f0ced6f-kube-api-access-78b5v\") pod \"barbican-db-sync-4pv5z\" (UID: \"943d069e-6ad4-4411-b937-c4499f0ced6f\") " pod="openstack/barbican-db-sync-4pv5z"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.407444 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vchwr\" (UniqueName: \"kubernetes.io/projected/402372ed-3c0d-4d12-a4f5-bbd82024a08d-kube-api-access-vchwr\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.407484 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-config\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.407530 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-ovsdbserver-sb\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.407647 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-dns-svc\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.407687 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d069e-6ad4-4411-b937-c4499f0ced6f-combined-ca-bundle\") pod \"barbican-db-sync-4pv5z\" (UID: \"943d069e-6ad4-4411-b937-c4499f0ced6f\") " pod="openstack/barbican-db-sync-4pv5z"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.407712 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-dns-swift-storage-0\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.426250 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-99d6b5b4f-2j7fk"]
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.434763 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d069e-6ad4-4411-b937-c4499f0ced6f-combined-ca-bundle\") pod \"barbican-db-sync-4pv5z\" (UID: \"943d069e-6ad4-4411-b937-c4499f0ced6f\") " pod="openstack/barbican-db-sync-4pv5z"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.441752 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/943d069e-6ad4-4411-b937-c4499f0ced6f-db-sync-config-data\") pod \"barbican-db-sync-4pv5z\" (UID: \"943d069e-6ad4-4411-b937-c4499f0ced6f\") " pod="openstack/barbican-db-sync-4pv5z"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.467999 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78b5v\" (UniqueName: \"kubernetes.io/projected/943d069e-6ad4-4411-b937-c4499f0ced6f-kube-api-access-78b5v\") pod \"barbican-db-sync-4pv5z\" (UID: \"943d069e-6ad4-4411-b937-c4499f0ced6f\") " pod="openstack/barbican-db-sync-4pv5z"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.514805 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-dns-svc\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.515134 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-config-data\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.515237 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-logs\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.515312 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-dns-swift-storage-0\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.515398 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-ovsdbserver-nb\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.515478 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-scripts\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.515556 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf7h7\" (UniqueName: \"kubernetes.io/projected/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-kube-api-access-xf7h7\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.515662 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vchwr\" (UniqueName: \"kubernetes.io/projected/402372ed-3c0d-4d12-a4f5-bbd82024a08d-kube-api-access-vchwr\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.515727 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-config\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.515804 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-horizon-secret-key\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.515881 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-ovsdbserver-sb\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.516916 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-ovsdbserver-sb\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.517588 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-dns-svc\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.518924 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-dns-swift-storage-0\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.519564 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-ovsdbserver-nb\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.521401 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-config\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.559377 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79f476f4fc-dsgbh"]
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.559755 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-98gbw"]
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.563895 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.564460 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-98gbw"]
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.564574 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-98gbw"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.565470 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vchwr\" (UniqueName: \"kubernetes.io/projected/402372ed-3c0d-4d12-a4f5-bbd82024a08d-kube-api-access-vchwr\") pod \"dnsmasq-dns-99d6b5b4f-2j7fk\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.577456 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.581146 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.583363 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-x7shj"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.617568 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-horizon-secret-key\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.622598 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-config-data\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.628839 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-logs\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.629316 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-scripts\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.629473 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf7h7\" (UniqueName: \"kubernetes.io/projected/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-kube-api-access-xf7h7\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.637204 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-config-data\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.638473 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-logs\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.640303 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-scripts\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.652138 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4pv5z"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.657014 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-horizon-secret-key\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.664809 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.670846 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.676856 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.677097 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.677714 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf7h7\" (UniqueName: \"kubernetes.io/projected/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-kube-api-access-xf7h7\") pod \"horizon-79f476f4fc-dsgbh\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.704826 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.740452 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-logs\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.741395 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-combined-ca-bundle\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.742080 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-config-data\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.743051 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-scripts\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.743692 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9t4k\" (UniqueName: \"kubernetes.io/projected/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-kube-api-access-w9t4k\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.760437 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.790745 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79f476f4fc-dsgbh"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.846272 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nf742"]
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.849192 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.849268 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-config-data\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.849286 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-scripts\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.849308 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.849342 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab448dfd-a67c-49b5-a153-92a5a6f504b2-log-httpd\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.849373 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab448dfd-a67c-49b5-a153-92a5a6f504b2-run-httpd\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.849400 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-config-data\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.849428 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-scripts\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.849490 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9t4k\" (UniqueName: \"kubernetes.io/projected/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-kube-api-access-w9t4k\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.849548 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-logs\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw"
Feb 19 10:02:11 crc kubenswrapper[4873]: I0219
10:02:11.849606 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-combined-ca-bundle\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.849627 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xxqg\" (UniqueName: \"kubernetes.io/projected/ab448dfd-a67c-49b5-a153-92a5a6f504b2-kube-api-access-4xxqg\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.862687 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-scripts\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.864260 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-config-data\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.882425 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-logs\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.890772 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-combined-ca-bundle\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.893516 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9t4k\" (UniqueName: \"kubernetes.io/projected/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-kube-api-access-w9t4k\") pod \"placement-db-sync-98gbw\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " pod="openstack/placement-db-sync-98gbw" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.958205 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xxqg\" (UniqueName: \"kubernetes.io/projected/ab448dfd-a67c-49b5-a153-92a5a6f504b2-kube-api-access-4xxqg\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.958263 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.958303 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-scripts\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.958323 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " 
pod="openstack/ceilometer-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.958357 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab448dfd-a67c-49b5-a153-92a5a6f504b2-log-httpd\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.958390 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab448dfd-a67c-49b5-a153-92a5a6f504b2-run-httpd\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.958422 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-config-data\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.959569 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab448dfd-a67c-49b5-a153-92a5a6f504b2-log-httpd\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.959797 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab448dfd-a67c-49b5-a153-92a5a6f504b2-run-httpd\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.967173 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.974146 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-scripts\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.977174 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.979072 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-config-data\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.989462 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-98gbw" Feb 19 10:02:11 crc kubenswrapper[4873]: I0219 10:02:11.993918 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xxqg\" (UniqueName: \"kubernetes.io/projected/ab448dfd-a67c-49b5-a153-92a5a6f504b2-kube-api-access-4xxqg\") pod \"ceilometer-0\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " pod="openstack/ceilometer-0" Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.000519 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c85f9c585-fhvwm"] Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.026684 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.287697 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nf742" event={"ID":"f631ba50-5961-428e-83a5-a8ddb50085d3","Type":"ContainerStarted","Data":"f5279a2fd1b4d18b8c04c2e7d237f62d2fc966d80132b8b737d8df303a78f856"} Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.301636 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" event={"ID":"90450452-854d-4886-b5ce-828f85c3f721","Type":"ContainerStarted","Data":"d4676ca17a3f8486e0adc6dd7b1d20b3b030b6852dc5274a36cbf6268c9cc2f3"} Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.420573 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vf762"] Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.472638 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.491485 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 10:02:12 crc kubenswrapper[4873]: W0219 10:02:12.495443 4873 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda575d51d_1ad3_422e_8e7c_b24b2c5de526.slice/crio-581b3a000070ae451f6e9ef110e53c2e98989bd07a9fba9a09c69f4b1ecfba88 WatchSource:0}: Error finding container 581b3a000070ae451f6e9ef110e53c2e98989bd07a9fba9a09c69f4b1ecfba88: Status 404 returned error can't find the container with id 581b3a000070ae451f6e9ef110e53c2e98989bd07a9fba9a09c69f4b1ecfba88 Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.502730 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.720092 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gqrb5"] Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.731269 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.741562 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-98c8c74bf-wsl5f"] Feb 19 10:02:12 crc kubenswrapper[4873]: W0219 10:02:12.750341 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc639af02_a4c8_40cf_947e_a50353ab2537.slice/crio-9eac663eb58aa13e3523f32d4bcb37aa001e4ee953f53b136a077545dfbf1008 WatchSource:0}: Error finding container 9eac663eb58aa13e3523f32d4bcb37aa001e4ee953f53b136a077545dfbf1008: Status 404 returned error can't find the container with id 9eac663eb58aa13e3523f32d4bcb37aa001e4ee953f53b136a077545dfbf1008 Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.775016 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.968220 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-4pv5z"] Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.977427 
4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-99d6b5b4f-2j7fk"] Feb 19 10:02:12 crc kubenswrapper[4873]: W0219 10:02:12.983687 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod402372ed_3c0d_4d12_a4f5_bbd82024a08d.slice/crio-676e73049674f4406ff08bc00e5c61a6ce15ece9c685ed8a76d9fda336863789 WatchSource:0}: Error finding container 676e73049674f4406ff08bc00e5c61a6ce15ece9c685ed8a76d9fda336863789: Status 404 returned error can't find the container with id 676e73049674f4406ff08bc00e5c61a6ce15ece9c685ed8a76d9fda336863789 Feb 19 10:02:12 crc kubenswrapper[4873]: I0219 10:02:12.984746 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79f476f4fc-dsgbh"] Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.229592 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-98gbw"] Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.259661 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.329422 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gqrb5" event={"ID":"ce5accb4-1da0-4a21-a289-7dba33ad935f","Type":"ContainerStarted","Data":"7091928b68df42cd9ae5c284cfdb9622dc758710a4af850abe1bece12bfc74a3"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.332621 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a575d51d-1ad3-422e-8e7c-b24b2c5de526","Type":"ContainerStarted","Data":"0782a10551b13dd61abc8e02874c936098093870d1d93858b972201b9cd3c7da"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.332875 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"a575d51d-1ad3-422e-8e7c-b24b2c5de526","Type":"ContainerStarted","Data":"e625952875a51eb5caf68d1f4611160ae7316ded4cd569ca37c367b0a5f6884b"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.332891 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a575d51d-1ad3-422e-8e7c-b24b2c5de526","Type":"ContainerStarted","Data":"581b3a000070ae451f6e9ef110e53c2e98989bd07a9fba9a09c69f4b1ecfba88"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.334566 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.340213 4873 generic.go:334] "Generic (PLEG): container finished" podID="90450452-854d-4886-b5ce-828f85c3f721" containerID="da13141d440c09b143067199911d8baeb575411d93028b4a5a0e7f29369df1da" exitCode=0 Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.340307 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" event={"ID":"90450452-854d-4886-b5ce-828f85c3f721","Type":"ContainerDied","Data":"da13141d440c09b143067199911d8baeb575411d93028b4a5a0e7f29369df1da"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.344210 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.151:9322/\": dial tcp 10.217.0.151:9322: connect: connection refused" Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.355908 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.355889357 podStartE2EDuration="3.355889357s" podCreationTimestamp="2026-02-19 10:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:13.353777434 +0000 UTC m=+1042.643209072" 
watchObservedRunningTime="2026-02-19 10:02:13.355889357 +0000 UTC m=+1042.645321015" Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.357595 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk" event={"ID":"402372ed-3c0d-4d12-a4f5-bbd82024a08d","Type":"ContainerStarted","Data":"676e73049674f4406ff08bc00e5c61a6ce15ece9c685ed8a76d9fda336863789"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.372585 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab448dfd-a67c-49b5-a153-92a5a6f504b2","Type":"ContainerStarted","Data":"c341b58fa66a9c7c1455f8e33fdfb22dd5f6b0a9b06cdb661264c78977069ea2"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.383471 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-98c8c74bf-wsl5f" event={"ID":"c639af02-a4c8-40cf-947e-a50353ab2537","Type":"ContainerStarted","Data":"9eac663eb58aa13e3523f32d4bcb37aa001e4ee953f53b136a077545dfbf1008"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.393499 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"95402218-fbbb-4453-aba6-d135ba3a26bd","Type":"ContainerStarted","Data":"db54f1f84baadc90d4260a330bf2f720ca4cd24fbaeefa2f3e5f18f033d41844"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.406045 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-98gbw" event={"ID":"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6","Type":"ContainerStarted","Data":"37b0ac3d48e8bec4044f6f8f22d9abb8c79cc58cedf7d1bbf6b0fb89fcc2a84a"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.429009 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"b8008736-31ec-491c-aa52-03b9413feab9","Type":"ContainerStarted","Data":"a2ea00441668ccb8b861c9df038d7b7675d321f71184b1fb21464b11d16d7eef"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.431370 
4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79f476f4fc-dsgbh" event={"ID":"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d","Type":"ContainerStarted","Data":"60cadfc4a70dc8e7874b849ac63b6a6b2ae5cbe2b77781abafab78ab09e65314"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.433815 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4pv5z" event={"ID":"943d069e-6ad4-4411-b937-c4499f0ced6f","Type":"ContainerStarted","Data":"db07d7286194b278e2cec929f66edc47c3ebbe39668738c5d112ac9e99a6a103"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.447887 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vf762" event={"ID":"99868e3f-82d7-4f0c-9056-661e95486e6e","Type":"ContainerStarted","Data":"22b91ea45d57e3f8ed16da3e5a4058c15af39a6d914075c4521ba6755b03990b"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.447934 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vf762" event={"ID":"99868e3f-82d7-4f0c-9056-661e95486e6e","Type":"ContainerStarted","Data":"51130049c4f72c45b52b368bcf10130af9e763c98f2e5fc842a0ae20064148f7"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.451309 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nf742" event={"ID":"f631ba50-5961-428e-83a5-a8ddb50085d3","Type":"ContainerStarted","Data":"dfe8a7cf5aeabc3bd0899d011af3258a91fc6d795682d82ec9f1f9c569448452"} Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.459331 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.465296 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vf762" podStartSLOduration=3.465280451 podStartE2EDuration="3.465280451s" podCreationTimestamp="2026-02-19 10:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:13.464976033 +0000 UTC m=+1042.754407671" watchObservedRunningTime="2026-02-19 10:02:13.465280451 +0000 UTC m=+1042.754712089" Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.549915 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-nf742" podStartSLOduration=3.54989385 podStartE2EDuration="3.54989385s" podCreationTimestamp="2026-02-19 10:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:13.542474946 +0000 UTC m=+1042.831906574" watchObservedRunningTime="2026-02-19 10:02:13.54989385 +0000 UTC m=+1042.839325488" Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.762914 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-98c8c74bf-wsl5f"] Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.834939 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.891909 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.949148 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.951212 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76bfd776d9-fdg7f"] Feb 19 10:02:13 crc kubenswrapper[4873]: E0219 10:02:13.951517 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90450452-854d-4886-b5ce-828f85c3f721" containerName="init" Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.951532 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="90450452-854d-4886-b5ce-828f85c3f721" containerName="init" Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.951718 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="90450452-854d-4886-b5ce-828f85c3f721" containerName="init" Feb 19 10:02:13 crc kubenswrapper[4873]: I0219 10:02:13.952589 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.049745 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76bfd776d9-fdg7f"] Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.082635 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-dns-swift-storage-0\") pod \"90450452-854d-4886-b5ce-828f85c3f721\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.082685 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-dns-svc\") pod \"90450452-854d-4886-b5ce-828f85c3f721\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.082704 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-config\") pod \"90450452-854d-4886-b5ce-828f85c3f721\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.082838 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-ovsdbserver-sb\") pod \"90450452-854d-4886-b5ce-828f85c3f721\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.082856 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-ovsdbserver-nb\") pod \"90450452-854d-4886-b5ce-828f85c3f721\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " Feb 19 10:02:14 crc 
kubenswrapper[4873]: I0219 10:02:14.082912 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tm6v\" (UniqueName: \"kubernetes.io/projected/90450452-854d-4886-b5ce-828f85c3f721-kube-api-access-2tm6v\") pod \"90450452-854d-4886-b5ce-828f85c3f721\" (UID: \"90450452-854d-4886-b5ce-828f85c3f721\") " Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.083116 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vfmw\" (UniqueName: \"kubernetes.io/projected/6827937b-ebcc-45a6-98e3-08d49115503b-kube-api-access-7vfmw\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.083152 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6827937b-ebcc-45a6-98e3-08d49115503b-scripts\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.083186 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6827937b-ebcc-45a6-98e3-08d49115503b-logs\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.083204 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6827937b-ebcc-45a6-98e3-08d49115503b-horizon-secret-key\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.083232 
4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6827937b-ebcc-45a6-98e3-08d49115503b-config-data\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.111256 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90450452-854d-4886-b5ce-828f85c3f721-kube-api-access-2tm6v" (OuterVolumeSpecName: "kube-api-access-2tm6v") pod "90450452-854d-4886-b5ce-828f85c3f721" (UID: "90450452-854d-4886-b5ce-828f85c3f721"). InnerVolumeSpecName "kube-api-access-2tm6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.118861 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "90450452-854d-4886-b5ce-828f85c3f721" (UID: "90450452-854d-4886-b5ce-828f85c3f721"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.136911 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-config" (OuterVolumeSpecName: "config") pod "90450452-854d-4886-b5ce-828f85c3f721" (UID: "90450452-854d-4886-b5ce-828f85c3f721"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.186383 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6827937b-ebcc-45a6-98e3-08d49115503b-logs\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.186434 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6827937b-ebcc-45a6-98e3-08d49115503b-horizon-secret-key\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.186481 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6827937b-ebcc-45a6-98e3-08d49115503b-config-data\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.186611 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vfmw\" (UniqueName: \"kubernetes.io/projected/6827937b-ebcc-45a6-98e3-08d49115503b-kube-api-access-7vfmw\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.186646 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6827937b-ebcc-45a6-98e3-08d49115503b-scripts\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 
10:02:14.186717 4873 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.186731 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.186743 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tm6v\" (UniqueName: \"kubernetes.io/projected/90450452-854d-4886-b5ce-828f85c3f721-kube-api-access-2tm6v\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.187454 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6827937b-ebcc-45a6-98e3-08d49115503b-scripts\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.188451 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6827937b-ebcc-45a6-98e3-08d49115503b-config-data\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.188819 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6827937b-ebcc-45a6-98e3-08d49115503b-logs\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.204837 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6827937b-ebcc-45a6-98e3-08d49115503b-horizon-secret-key\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.205170 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90450452-854d-4886-b5ce-828f85c3f721" (UID: "90450452-854d-4886-b5ce-828f85c3f721"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.254093 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90450452-854d-4886-b5ce-828f85c3f721" (UID: "90450452-854d-4886-b5ce-828f85c3f721"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.268762 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vfmw\" (UniqueName: \"kubernetes.io/projected/6827937b-ebcc-45a6-98e3-08d49115503b-kube-api-access-7vfmw\") pod \"horizon-76bfd776d9-fdg7f\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.269451 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90450452-854d-4886-b5ce-828f85c3f721" (UID: "90450452-854d-4886-b5ce-828f85c3f721"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.291821 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.292793 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.292818 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.292827 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90450452-854d-4886-b5ce-828f85c3f721-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.476142 4873 generic.go:334] "Generic (PLEG): container finished" podID="402372ed-3c0d-4d12-a4f5-bbd82024a08d" containerID="0caa3e8656105d67ae98953c7c54ce1e536f9e27f2d0305163026fbf53ca79e0" exitCode=0 Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.477328 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk" event={"ID":"402372ed-3c0d-4d12-a4f5-bbd82024a08d","Type":"ContainerDied","Data":"0caa3e8656105d67ae98953c7c54ce1e536f9e27f2d0305163026fbf53ca79e0"} Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.481907 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" event={"ID":"90450452-854d-4886-b5ce-828f85c3f721","Type":"ContainerDied","Data":"d4676ca17a3f8486e0adc6dd7b1d20b3b030b6852dc5274a36cbf6268c9cc2f3"} Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.481940 4873 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c85f9c585-fhvwm" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.481951 4873 scope.go:117] "RemoveContainer" containerID="da13141d440c09b143067199911d8baeb575411d93028b4a5a0e7f29369df1da" Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.662623 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c85f9c585-fhvwm"] Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.670572 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c85f9c585-fhvwm"] Feb 19 10:02:14 crc kubenswrapper[4873]: I0219 10:02:14.900001 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76bfd776d9-fdg7f"] Feb 19 10:02:14 crc kubenswrapper[4873]: W0219 10:02:14.912943 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6827937b_ebcc_45a6_98e3_08d49115503b.slice/crio-3316c81d04a3ec6d98aa8cad078c0b0a6499f0b69e6af433f3ff3dc9ecbf7c2c WatchSource:0}: Error finding container 3316c81d04a3ec6d98aa8cad078c0b0a6499f0b69e6af433f3ff3dc9ecbf7c2c: Status 404 returned error can't find the container with id 3316c81d04a3ec6d98aa8cad078c0b0a6499f0b69e6af433f3ff3dc9ecbf7c2c Feb 19 10:02:15 crc kubenswrapper[4873]: I0219 10:02:15.511020 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api" containerID="cri-o://0782a10551b13dd61abc8e02874c936098093870d1d93858b972201b9cd3c7da" gracePeriod=30 Feb 19 10:02:15 crc kubenswrapper[4873]: I0219 10:02:15.511008 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api-log" containerID="cri-o://e625952875a51eb5caf68d1f4611160ae7316ded4cd569ca37c367b0a5f6884b" gracePeriod=30 
Feb 19 10:02:15 crc kubenswrapper[4873]: I0219 10:02:15.513091 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90450452-854d-4886-b5ce-828f85c3f721" path="/var/lib/kubelet/pods/90450452-854d-4886-b5ce-828f85c3f721/volumes" Feb 19 10:02:15 crc kubenswrapper[4873]: I0219 10:02:15.514061 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk" Feb 19 10:02:15 crc kubenswrapper[4873]: I0219 10:02:15.514095 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk" event={"ID":"402372ed-3c0d-4d12-a4f5-bbd82024a08d","Type":"ContainerStarted","Data":"7a12463c2cf197b1f920440df50985d94ae3e7a22c56ad882d01bc741d80d703"} Feb 19 10:02:15 crc kubenswrapper[4873]: I0219 10:02:15.514130 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bfd776d9-fdg7f" event={"ID":"6827937b-ebcc-45a6-98e3-08d49115503b","Type":"ContainerStarted","Data":"3316c81d04a3ec6d98aa8cad078c0b0a6499f0b69e6af433f3ff3dc9ecbf7c2c"} Feb 19 10:02:15 crc kubenswrapper[4873]: I0219 10:02:15.534839 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk" podStartSLOduration=4.534817605 podStartE2EDuration="4.534817605s" podCreationTimestamp="2026-02-19 10:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:15.530079897 +0000 UTC m=+1044.819511545" watchObservedRunningTime="2026-02-19 10:02:15.534817605 +0000 UTC m=+1044.824249243" Feb 19 10:02:15 crc kubenswrapper[4873]: I0219 10:02:15.572395 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.151:9322/\": EOF" Feb 19 10:02:16 crc kubenswrapper[4873]: I0219 10:02:16.054543 4873 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 10:02:16 crc kubenswrapper[4873]: I0219 10:02:16.528686 4873 generic.go:334] "Generic (PLEG): container finished" podID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerID="e625952875a51eb5caf68d1f4611160ae7316ded4cd569ca37c367b0a5f6884b" exitCode=143 Feb 19 10:02:16 crc kubenswrapper[4873]: I0219 10:02:16.528784 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a575d51d-1ad3-422e-8e7c-b24b2c5de526","Type":"ContainerDied","Data":"e625952875a51eb5caf68d1f4611160ae7316ded4cd569ca37c367b0a5f6884b"} Feb 19 10:02:18 crc kubenswrapper[4873]: I0219 10:02:18.334018 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.151:9322/\": read tcp 10.217.0.2:59434->10.217.0.151:9322: read: connection reset by peer" Feb 19 10:02:18 crc kubenswrapper[4873]: I0219 10:02:18.334965 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.151:9322/\": dial tcp 10.217.0.151:9322: connect: connection refused" Feb 19 10:02:19 crc kubenswrapper[4873]: I0219 10:02:19.582996 4873 generic.go:334] "Generic (PLEG): container finished" podID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerID="0782a10551b13dd61abc8e02874c936098093870d1d93858b972201b9cd3c7da" exitCode=0 Feb 19 10:02:19 crc kubenswrapper[4873]: I0219 10:02:19.583075 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a575d51d-1ad3-422e-8e7c-b24b2c5de526","Type":"ContainerDied","Data":"0782a10551b13dd61abc8e02874c936098093870d1d93858b972201b9cd3c7da"} Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.488602 4873 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/horizon-79f476f4fc-dsgbh"] Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.512492 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-87df9b646-2jf26"] Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.513948 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.515384 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.533499 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-87df9b646-2jf26"] Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.595943 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76bfd776d9-fdg7f"] Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.608910 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6687d9896d-v96j2"] Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.620004 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6687d9896d-v96j2"] Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.620187 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.679486 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cace1157-1459-4823-aa8f-b2c246d3adeb-config-data\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.679555 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-horizon-tls-certs\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.679621 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cace1157-1459-4823-aa8f-b2c246d3adeb-logs\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.679649 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-horizon-secret-key\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.679691 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt75m\" (UniqueName: \"kubernetes.io/projected/cace1157-1459-4823-aa8f-b2c246d3adeb-kube-api-access-jt75m\") pod \"horizon-87df9b646-2jf26\" (UID: 
\"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.679709 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cace1157-1459-4823-aa8f-b2c246d3adeb-scripts\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.679725 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-combined-ca-bundle\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.781011 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7929\" (UniqueName: \"kubernetes.io/projected/fa527f64-6e38-48c2-9927-a319f4579070-kube-api-access-f7929\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.781078 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cace1157-1459-4823-aa8f-b2c246d3adeb-config-data\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.781140 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa527f64-6e38-48c2-9927-a319f4579070-combined-ca-bundle\") pod \"horizon-6687d9896d-v96j2\" (UID: 
\"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.781180 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-horizon-tls-certs\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.783240 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cace1157-1459-4823-aa8f-b2c246d3adeb-logs\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.783340 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa527f64-6e38-48c2-9927-a319f4579070-logs\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.783401 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-horizon-secret-key\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.783568 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa527f64-6e38-48c2-9927-a319f4579070-horizon-tls-certs\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 
10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.783622 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt75m\" (UniqueName: \"kubernetes.io/projected/cace1157-1459-4823-aa8f-b2c246d3adeb-kube-api-access-jt75m\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.783646 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cace1157-1459-4823-aa8f-b2c246d3adeb-scripts\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.783673 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa527f64-6e38-48c2-9927-a319f4579070-scripts\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.783704 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-combined-ca-bundle\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.783776 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fa527f64-6e38-48c2-9927-a319f4579070-horizon-secret-key\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.783859 4873 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa527f64-6e38-48c2-9927-a319f4579070-config-data\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.784607 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cace1157-1459-4823-aa8f-b2c246d3adeb-config-data\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.784900 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cace1157-1459-4823-aa8f-b2c246d3adeb-logs\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.785282 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cace1157-1459-4823-aa8f-b2c246d3adeb-scripts\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.790507 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-horizon-tls-certs\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.791587 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-combined-ca-bundle\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.806240 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-horizon-secret-key\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.815622 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt75m\" (UniqueName: \"kubernetes.io/projected/cace1157-1459-4823-aa8f-b2c246d3adeb-kube-api-access-jt75m\") pod \"horizon-87df9b646-2jf26\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.841451 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.887170 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa527f64-6e38-48c2-9927-a319f4579070-scripts\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.887233 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fa527f64-6e38-48c2-9927-a319f4579070-horizon-secret-key\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.887281 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa527f64-6e38-48c2-9927-a319f4579070-config-data\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.887364 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7929\" (UniqueName: \"kubernetes.io/projected/fa527f64-6e38-48c2-9927-a319f4579070-kube-api-access-f7929\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.887426 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa527f64-6e38-48c2-9927-a319f4579070-combined-ca-bundle\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc 
kubenswrapper[4873]: I0219 10:02:20.887504 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa527f64-6e38-48c2-9927-a319f4579070-logs\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.887560 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa527f64-6e38-48c2-9927-a319f4579070-horizon-tls-certs\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.889866 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa527f64-6e38-48c2-9927-a319f4579070-scripts\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.890158 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa527f64-6e38-48c2-9927-a319f4579070-logs\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.891924 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa527f64-6e38-48c2-9927-a319f4579070-config-data\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.894745 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fa527f64-6e38-48c2-9927-a319f4579070-combined-ca-bundle\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.895280 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fa527f64-6e38-48c2-9927-a319f4579070-horizon-secret-key\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.901565 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa527f64-6e38-48c2-9927-a319f4579070-horizon-tls-certs\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.908888 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7929\" (UniqueName: \"kubernetes.io/projected/fa527f64-6e38-48c2-9927-a319f4579070-kube-api-access-f7929\") pod \"horizon-6687d9896d-v96j2\" (UID: \"fa527f64-6e38-48c2-9927-a319f4579070\") " pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:20 crc kubenswrapper[4873]: I0219 10:02:20.947812 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:21 crc kubenswrapper[4873]: I0219 10:02:21.617828 4873 generic.go:334] "Generic (PLEG): container finished" podID="f631ba50-5961-428e-83a5-a8ddb50085d3" containerID="dfe8a7cf5aeabc3bd0899d011af3258a91fc6d795682d82ec9f1f9c569448452" exitCode=0 Feb 19 10:02:21 crc kubenswrapper[4873]: I0219 10:02:21.617866 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nf742" event={"ID":"f631ba50-5961-428e-83a5-a8ddb50085d3","Type":"ContainerDied","Data":"dfe8a7cf5aeabc3bd0899d011af3258a91fc6d795682d82ec9f1f9c569448452"} Feb 19 10:02:21 crc kubenswrapper[4873]: I0219 10:02:21.762268 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk" Feb 19 10:02:21 crc kubenswrapper[4873]: I0219 10:02:21.838635 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f77dfd79f-tg9w4"] Feb 19 10:02:21 crc kubenswrapper[4873]: I0219 10:02:21.838866 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" podUID="8e8c0292-715e-4d4d-a552-5229adfc3e74" containerName="dnsmasq-dns" containerID="cri-o://3202df88b506237a1560baea9fc86854fa472069f50ad6c6f94a7855eaa6ff1a" gracePeriod=10 Feb 19 10:02:22 crc kubenswrapper[4873]: I0219 10:02:22.528353 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" podUID="8e8c0292-715e-4d4d-a552-5229adfc3e74" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: connect: connection refused" Feb 19 10:02:22 crc kubenswrapper[4873]: I0219 10:02:22.634344 4873 generic.go:334] "Generic (PLEG): container finished" podID="8e8c0292-715e-4d4d-a552-5229adfc3e74" containerID="3202df88b506237a1560baea9fc86854fa472069f50ad6c6f94a7855eaa6ff1a" exitCode=0 Feb 19 10:02:22 crc kubenswrapper[4873]: I0219 10:02:22.634423 4873 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" event={"ID":"8e8c0292-715e-4d4d-a552-5229adfc3e74","Type":"ContainerDied","Data":"3202df88b506237a1560baea9fc86854fa472069f50ad6c6f94a7855eaa6ff1a"} Feb 19 10:02:24 crc kubenswrapper[4873]: I0219 10:02:24.655950 4873 generic.go:334] "Generic (PLEG): container finished" podID="7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba" containerID="e5eec3e87329724888651bc35b53713711df95ef48801142ac4dd2488284d91d" exitCode=0 Feb 19 10:02:24 crc kubenswrapper[4873]: I0219 10:02:24.656071 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9472r" event={"ID":"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba","Type":"ContainerDied","Data":"e5eec3e87329724888651bc35b53713711df95ef48801142ac4dd2488284d91d"} Feb 19 10:02:26 crc kubenswrapper[4873]: I0219 10:02:26.054501 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.151:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 10:02:27 crc kubenswrapper[4873]: I0219 10:02:27.528692 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" podUID="8e8c0292-715e-4d4d-a552-5229adfc3e74" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: connect: connection refused" Feb 19 10:02:30 crc kubenswrapper[4873]: E0219 10:02:30.184301 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 19 10:02:30 crc kubenswrapper[4873]: E0219 10:02:30.184730 4873 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="38.102.83.20:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 19 10:02:30 crc kubenswrapper[4873]: E0219 10:02:30.184907 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.20:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n675h65ch64ch558h87h697h55bhf6h59dhddh88hfh546hcch5c6h66ch56chdch547h5cbh59fh667hb5h5c7h7ch8ch6ch55bhfch89hf8h67q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xf7h7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,V
olumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-79f476f4fc-dsgbh_openstack(dca31fe9-df4d-4734-afcd-b0ebf4a54e4d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 10:02:30 crc kubenswrapper[4873]: E0219 10:02:30.187389 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.20:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-79f476f4fc-dsgbh" podUID="dca31fe9-df4d-4734-afcd-b0ebf4a54e4d" Feb 19 10:02:31 crc kubenswrapper[4873]: I0219 10:02:31.055880 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.151:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 10:02:31.489227 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 10:02:31.489292 4873 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 10:02:31.489465 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:placement-db-sync,Image:38.102.83.20:5001/podified-master-centos10/openstack-placement-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w9t4k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevice
s:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-98gbw_openstack(ec5489a2-23e2-4875-a19b-d15b4ad6c8c6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 10:02:31.490688 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-98gbw" podUID="ec5489a2-23e2-4875-a19b-d15b4ad6c8c6" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 10:02:31.736134 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.20:5001/podified-master-centos10/openstack-placement-api:watcher_latest\\\"\"" pod="openstack/placement-db-sync-98gbw" podUID="ec5489a2-23e2-4875-a19b-d15b4ad6c8c6" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 10:02:31.801025 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 10:02:31.801562 4873 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 10:02:31.801700 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:38.102.83.20:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n646h68ch56fh5dh5cch696h75h8ch556h67dh676h58bh547h579h9ch6h6ch655h67ch7ch544h654h5f7h686h9h66dh559h79h5d7h8dh75hc6q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4xxqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(ab448dfd-a67c-49b5-a153-92a5a6f504b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 10:02:31.822482 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 10:02:31.822537 4873 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 10:02:31.822655 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.20:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c6h8ch659h576h658h544h54ch56ch594h65bh56ch54fhfh598h5b8h5c4h567h75h5dhc9hbhfch97hcbh684h559h5d9h64bhdbh555h668h689q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7vfmw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-76bfd776d9-fdg7f_openstack(6827937b-ebcc-45a6-98e3-08d49115503b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 
10:02:31.825656 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.20:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-76bfd776d9-fdg7f" podUID="6827937b-ebcc-45a6-98e3-08d49115503b" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 10:02:31.846006 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 10:02:31.846913 4873 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-horizon:watcher_latest" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 10:02:31.847169 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.20:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n567h546hb4h5c4h9fh67h64chbbh5b4h68fh659hcdh696h574h666h59h555hfdh55ch59fh656h67bh58h689h5b5h5c6h564h697h579h8dh674h5c4q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pdfl7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-98c8c74bf-wsl5f_openstack(c639af02-a4c8-40cf-947e-a50353ab2537): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 10:02:31 crc kubenswrapper[4873]: E0219 
10:02:31.851918 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.20:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-98c8c74bf-wsl5f" podUID="c639af02-a4c8-40cf-947e-a50353ab2537" Feb 19 10:02:31 crc kubenswrapper[4873]: I0219 10:02:31.902624 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 10:02:31 crc kubenswrapper[4873]: I0219 10:02:31.908239 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:31 crc kubenswrapper[4873]: I0219 10:02:31.921641 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9472r" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.100033 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-custom-prometheus-ca\") pod \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.101413 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-combined-ca-bundle\") pod \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.101470 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sphnf\" (UniqueName: 
\"kubernetes.io/projected/f631ba50-5961-428e-83a5-a8ddb50085d3-kube-api-access-sphnf\") pod \"f631ba50-5961-428e-83a5-a8ddb50085d3\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.101511 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-config-data\") pod \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.101548 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-db-sync-config-data\") pod \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.101614 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-config-data\") pod \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.101649 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-credential-keys\") pod \"f631ba50-5961-428e-83a5-a8ddb50085d3\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.101690 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a575d51d-1ad3-422e-8e7c-b24b2c5de526-logs\") pod \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.101714 4873 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-fernet-keys\") pod \"f631ba50-5961-428e-83a5-a8ddb50085d3\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.101745 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-combined-ca-bundle\") pod \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.101784 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bxrn\" (UniqueName: \"kubernetes.io/projected/a575d51d-1ad3-422e-8e7c-b24b2c5de526-kube-api-access-9bxrn\") pod \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\" (UID: \"a575d51d-1ad3-422e-8e7c-b24b2c5de526\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.101812 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-combined-ca-bundle\") pod \"f631ba50-5961-428e-83a5-a8ddb50085d3\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.101836 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-scripts\") pod \"f631ba50-5961-428e-83a5-a8ddb50085d3\" (UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.101861 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-config-data\") pod \"f631ba50-5961-428e-83a5-a8ddb50085d3\" 
(UID: \"f631ba50-5961-428e-83a5-a8ddb50085d3\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.101893 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btfz7\" (UniqueName: \"kubernetes.io/projected/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-kube-api-access-btfz7\") pod \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\" (UID: \"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba\") " Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.102411 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a575d51d-1ad3-422e-8e7c-b24b2c5de526-logs" (OuterVolumeSpecName: "logs") pod "a575d51d-1ad3-422e-8e7c-b24b2c5de526" (UID: "a575d51d-1ad3-422e-8e7c-b24b2c5de526"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.102653 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a575d51d-1ad3-422e-8e7c-b24b2c5de526-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.107251 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f631ba50-5961-428e-83a5-a8ddb50085d3" (UID: "f631ba50-5961-428e-83a5-a8ddb50085d3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.107268 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba" (UID: "7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.110332 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a575d51d-1ad3-422e-8e7c-b24b2c5de526-kube-api-access-9bxrn" (OuterVolumeSpecName: "kube-api-access-9bxrn") pod "a575d51d-1ad3-422e-8e7c-b24b2c5de526" (UID: "a575d51d-1ad3-422e-8e7c-b24b2c5de526"). InnerVolumeSpecName "kube-api-access-9bxrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.110451 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f631ba50-5961-428e-83a5-a8ddb50085d3" (UID: "f631ba50-5961-428e-83a5-a8ddb50085d3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.112383 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-kube-api-access-btfz7" (OuterVolumeSpecName: "kube-api-access-btfz7") pod "7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba" (UID: "7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba"). InnerVolumeSpecName "kube-api-access-btfz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.114859 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f631ba50-5961-428e-83a5-a8ddb50085d3-kube-api-access-sphnf" (OuterVolumeSpecName: "kube-api-access-sphnf") pod "f631ba50-5961-428e-83a5-a8ddb50085d3" (UID: "f631ba50-5961-428e-83a5-a8ddb50085d3"). InnerVolumeSpecName "kube-api-access-sphnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.121811 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-scripts" (OuterVolumeSpecName: "scripts") pod "f631ba50-5961-428e-83a5-a8ddb50085d3" (UID: "f631ba50-5961-428e-83a5-a8ddb50085d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.128033 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba" (UID: "7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.135247 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f631ba50-5961-428e-83a5-a8ddb50085d3" (UID: "f631ba50-5961-428e-83a5-a8ddb50085d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.140544 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-config-data" (OuterVolumeSpecName: "config-data") pod "f631ba50-5961-428e-83a5-a8ddb50085d3" (UID: "f631ba50-5961-428e-83a5-a8ddb50085d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.148213 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "a575d51d-1ad3-422e-8e7c-b24b2c5de526" (UID: "a575d51d-1ad3-422e-8e7c-b24b2c5de526"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.158818 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a575d51d-1ad3-422e-8e7c-b24b2c5de526" (UID: "a575d51d-1ad3-422e-8e7c-b24b2c5de526"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.160975 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-config-data" (OuterVolumeSpecName: "config-data") pod "7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba" (UID: "7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.187608 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-config-data" (OuterVolumeSpecName: "config-data") pod "a575d51d-1ad3-422e-8e7c-b24b2c5de526" (UID: "a575d51d-1ad3-422e-8e7c-b24b2c5de526"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.204084 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.204151 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sphnf\" (UniqueName: \"kubernetes.io/projected/f631ba50-5961-428e-83a5-a8ddb50085d3-kube-api-access-sphnf\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.204196 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.204207 4873 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.204216 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.204225 4873 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.204232 4873 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.204240 4873 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.204249 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bxrn\" (UniqueName: \"kubernetes.io/projected/a575d51d-1ad3-422e-8e7c-b24b2c5de526-kube-api-access-9bxrn\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.204275 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.204293 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.204302 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631ba50-5961-428e-83a5-a8ddb50085d3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.204313 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btfz7\" (UniqueName: \"kubernetes.io/projected/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba-kube-api-access-btfz7\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.204322 4873 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a575d51d-1ad3-422e-8e7c-b24b2c5de526-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.750450 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nf742" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.750404 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nf742" event={"ID":"f631ba50-5961-428e-83a5-a8ddb50085d3","Type":"ContainerDied","Data":"f5279a2fd1b4d18b8c04c2e7d237f62d2fc966d80132b8b737d8df303a78f856"} Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.750528 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5279a2fd1b4d18b8c04c2e7d237f62d2fc966d80132b8b737d8df303a78f856" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.753749 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.754012 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a575d51d-1ad3-422e-8e7c-b24b2c5de526","Type":"ContainerDied","Data":"581b3a000070ae451f6e9ef110e53c2e98989bd07a9fba9a09c69f4b1ecfba88"} Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.754044 4873 scope.go:117] "RemoveContainer" containerID="0782a10551b13dd61abc8e02874c936098093870d1d93858b972201b9cd3c7da" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.767778 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9472r" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.768260 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9472r" event={"ID":"7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba","Type":"ContainerDied","Data":"e77590cdf691d40582aa326fa0b9971a60cdea85c0908db700bc4cf93f06c74b"} Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.768306 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e77590cdf691d40582aa326fa0b9971a60cdea85c0908db700bc4cf93f06c74b" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.841814 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.853548 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.860863 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 19 10:02:32 crc kubenswrapper[4873]: E0219 10:02:32.861218 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba" containerName="glance-db-sync" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.861232 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba" containerName="glance-db-sync" Feb 19 10:02:32 crc kubenswrapper[4873]: E0219 10:02:32.861246 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.861252 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api" Feb 19 10:02:32 crc kubenswrapper[4873]: E0219 10:02:32.861270 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f631ba50-5961-428e-83a5-a8ddb50085d3" 
containerName="keystone-bootstrap" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.861277 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f631ba50-5961-428e-83a5-a8ddb50085d3" containerName="keystone-bootstrap" Feb 19 10:02:32 crc kubenswrapper[4873]: E0219 10:02:32.861288 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api-log" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.861293 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api-log" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.861468 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f631ba50-5961-428e-83a5-a8ddb50085d3" containerName="keystone-bootstrap" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.861483 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.861494 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba" containerName="glance-db-sync" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.861508 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api-log" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.862356 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.866974 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.899508 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.923055 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-logs\") pod \"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.923093 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-config-data\") pod \"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.923176 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrwmp\" (UniqueName: \"kubernetes.io/projected/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-kube-api-access-jrwmp\") pod \"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.923196 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:32 crc kubenswrapper[4873]: I0219 10:02:32.923271 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.027805 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.027908 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-logs\") pod \"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.027936 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-config-data\") pod \"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.028000 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrwmp\" (UniqueName: \"kubernetes.io/projected/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-kube-api-access-jrwmp\") pod \"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.028023 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-custom-prometheus-ca\") pod 
\"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.029755 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-logs\") pod \"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.032442 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.037341 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.038061 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-config-data\") pod \"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.047792 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrwmp\" (UniqueName: \"kubernetes.io/projected/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-kube-api-access-jrwmp\") pod \"watcher-api-0\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " pod="openstack/watcher-api-0" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.151116 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-bootstrap-nf742"] Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.161744 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-nf742"] Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.191045 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.215890 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wrcpc"] Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.217096 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.218896 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.218991 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-27d74" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.219433 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.222962 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.223187 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.243655 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-credential-keys\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.243782 4873 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-scripts\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.243843 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-config-data\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.244047 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jctkh\" (UniqueName: \"kubernetes.io/projected/58099bc8-1a29-467b-b13d-c0713e42e6c2-kube-api-access-jctkh\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.244176 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-fernet-keys\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.244196 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-combined-ca-bundle\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.253967 4873 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wrcpc"] Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.345507 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-scripts\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.345797 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-config-data\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.345853 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jctkh\" (UniqueName: \"kubernetes.io/projected/58099bc8-1a29-467b-b13d-c0713e42e6c2-kube-api-access-jctkh\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.345889 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-fernet-keys\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.345904 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-combined-ca-bundle\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 
crc kubenswrapper[4873]: I0219 10:02:33.345930 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-credential-keys\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.354690 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5687f4c549-n4g4v"] Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.356396 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.359563 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-fernet-keys\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.359640 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-config-data\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.365651 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-credential-keys\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.365899 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-combined-ca-bundle\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.367789 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5687f4c549-n4g4v"] Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.372184 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-scripts\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.393036 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jctkh\" (UniqueName: \"kubernetes.io/projected/58099bc8-1a29-467b-b13d-c0713e42e6c2-kube-api-access-jctkh\") pod \"keystone-bootstrap-wrcpc\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.447184 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-ovsdbserver-sb\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.447238 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-config\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.447267 4873 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9mv6\" (UniqueName: \"kubernetes.io/projected/64deb684-42f6-4bb5-b774-ef57839a56d5-kube-api-access-x9mv6\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.447287 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-ovsdbserver-nb\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.447458 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-dns-swift-storage-0\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.447535 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-dns-svc\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.500220 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" path="/var/lib/kubelet/pods/a575d51d-1ad3-422e-8e7c-b24b2c5de526/volumes" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.500939 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f631ba50-5961-428e-83a5-a8ddb50085d3" 
path="/var/lib/kubelet/pods/f631ba50-5961-428e-83a5-a8ddb50085d3/volumes" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.549542 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-dns-swift-storage-0\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.549744 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-dns-svc\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.549784 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-ovsdbserver-sb\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.549808 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-config\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.549828 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9mv6\" (UniqueName: \"kubernetes.io/projected/64deb684-42f6-4bb5-b774-ef57839a56d5-kube-api-access-x9mv6\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " 
pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.549850 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-ovsdbserver-nb\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.550485 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.550733 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-dns-swift-storage-0\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.550754 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-ovsdbserver-nb\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.551590 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-ovsdbserver-sb\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.551656 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-dns-svc\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.551818 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-config\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.567663 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9mv6\" (UniqueName: \"kubernetes.io/projected/64deb684-42f6-4bb5-b774-ef57839a56d5-kube-api-access-x9mv6\") pod \"dnsmasq-dns-5687f4c549-n4g4v\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:33 crc kubenswrapper[4873]: I0219 10:02:33.788735 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.379931 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.381870 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.383597 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-n9qxt" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.384317 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.384522 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.416950 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.464689 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.464734 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-scripts\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.464778 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: 
I0219 10:02:34.464828 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.464916 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-logs\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.464938 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-288js\" (UniqueName: \"kubernetes.io/projected/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-kube-api-access-288js\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.465056 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-config-data\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.475865 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.477511 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.480223 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.489980 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.566866 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.567185 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-scripts\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.567220 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.567259 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.567286 
4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-logs\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.567304 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-288js\" (UniqueName: \"kubernetes.io/projected/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-kube-api-access-288js\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.567402 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-config-data\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.568223 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.568680 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.573308 4873 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-scripts\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.573865 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-logs\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.582264 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.582882 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-config-data\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.591330 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-288js\" (UniqueName: \"kubernetes.io/projected/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-kube-api-access-288js\") pod \"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.612654 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"glance-default-external-api-0\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.669262 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.669320 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-config-data\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.669359 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd66693-8b61-499e-a8c8-f8545b8fcced-logs\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.669383 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.669477 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.669496 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxh4r\" (UniqueName: \"kubernetes.io/projected/efd66693-8b61-499e-a8c8-f8545b8fcced-kube-api-access-wxh4r\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.669544 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efd66693-8b61-499e-a8c8-f8545b8fcced-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.720454 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.770861 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-config-data\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.770921 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd66693-8b61-499e-a8c8-f8545b8fcced-logs\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.770949 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.771038 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-scripts\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.771062 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxh4r\" (UniqueName: \"kubernetes.io/projected/efd66693-8b61-499e-a8c8-f8545b8fcced-kube-api-access-wxh4r\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" 
Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.771099 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efd66693-8b61-499e-a8c8-f8545b8fcced-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.771181 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.771317 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.772004 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd66693-8b61-499e-a8c8-f8545b8fcced-logs\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.772341 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efd66693-8b61-499e-a8c8-f8545b8fcced-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.775026 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-scripts\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.780810 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.781453 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-config-data\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.801636 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxh4r\" (UniqueName: \"kubernetes.io/projected/efd66693-8b61-499e-a8c8-f8545b8fcced-kube-api-access-wxh4r\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:34 crc kubenswrapper[4873]: I0219 10:02:34.803217 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:35 crc kubenswrapper[4873]: I0219 10:02:35.093326 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 10:02:36 crc kubenswrapper[4873]: I0219 10:02:36.056680 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a575d51d-1ad3-422e-8e7c-b24b2c5de526" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.151:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 10:02:36 crc kubenswrapper[4873]: I0219 10:02:36.127165 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 10:02:36 crc kubenswrapper[4873]: I0219 10:02:36.213301 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:02:37 crc kubenswrapper[4873]: I0219 10:02:37.528666 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" podUID="8e8c0292-715e-4d4d-a552-5229adfc3e74" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: i/o timeout" Feb 19 10:02:37 crc kubenswrapper[4873]: I0219 10:02:37.529008 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" Feb 19 10:02:42 crc kubenswrapper[4873]: I0219 10:02:42.529294 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" podUID="8e8c0292-715e-4d4d-a552-5229adfc3e74" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: i/o timeout" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.518755 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.520803 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79f476f4fc-dsgbh" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.526657 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.531081 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.643201 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6827937b-ebcc-45a6-98e3-08d49115503b-config-data\") pod \"6827937b-ebcc-45a6-98e3-08d49115503b\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.643306 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c639af02-a4c8-40cf-947e-a50353ab2537-config-data\") pod \"c639af02-a4c8-40cf-947e-a50353ab2537\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.643339 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-config-data\") pod \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.643368 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-ovsdbserver-nb\") pod \"8e8c0292-715e-4d4d-a552-5229adfc3e74\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.643425 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-logs\") pod \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.643455 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vfmw\" (UniqueName: \"kubernetes.io/projected/6827937b-ebcc-45a6-98e3-08d49115503b-kube-api-access-7vfmw\") pod \"6827937b-ebcc-45a6-98e3-08d49115503b\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.643706 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-config\") pod \"8e8c0292-715e-4d4d-a552-5229adfc3e74\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.643733 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-ovsdbserver-sb\") pod \"8e8c0292-715e-4d4d-a552-5229adfc3e74\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.643792 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c639af02-a4c8-40cf-947e-a50353ab2537-logs\") pod \"c639af02-a4c8-40cf-947e-a50353ab2537\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.643832 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6827937b-ebcc-45a6-98e3-08d49115503b-scripts\") pod \"6827937b-ebcc-45a6-98e3-08d49115503b\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.643856 4873 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdfl7\" (UniqueName: \"kubernetes.io/projected/c639af02-a4c8-40cf-947e-a50353ab2537-kube-api-access-pdfl7\") pod \"c639af02-a4c8-40cf-947e-a50353ab2537\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.643915 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-scripts\") pod \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.643950 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-dns-swift-storage-0\") pod \"8e8c0292-715e-4d4d-a552-5229adfc3e74\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.643973 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c639af02-a4c8-40cf-947e-a50353ab2537-scripts\") pod \"c639af02-a4c8-40cf-947e-a50353ab2537\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.644000 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6827937b-ebcc-45a6-98e3-08d49115503b-logs\") pod \"6827937b-ebcc-45a6-98e3-08d49115503b\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.645637 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c639af02-a4c8-40cf-947e-a50353ab2537-logs" (OuterVolumeSpecName: "logs") pod "c639af02-a4c8-40cf-947e-a50353ab2537" (UID: 
"c639af02-a4c8-40cf-947e-a50353ab2537"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.645845 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6827937b-ebcc-45a6-98e3-08d49115503b-logs" (OuterVolumeSpecName: "logs") pod "6827937b-ebcc-45a6-98e3-08d49115503b" (UID: "6827937b-ebcc-45a6-98e3-08d49115503b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.645989 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf7h7\" (UniqueName: \"kubernetes.io/projected/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-kube-api-access-xf7h7\") pod \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.646186 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k6b5\" (UniqueName: \"kubernetes.io/projected/8e8c0292-715e-4d4d-a552-5229adfc3e74-kube-api-access-2k6b5\") pod \"8e8c0292-715e-4d4d-a552-5229adfc3e74\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.646331 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6827937b-ebcc-45a6-98e3-08d49115503b-horizon-secret-key\") pod \"6827937b-ebcc-45a6-98e3-08d49115503b\" (UID: \"6827937b-ebcc-45a6-98e3-08d49115503b\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.646421 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6827937b-ebcc-45a6-98e3-08d49115503b-scripts" (OuterVolumeSpecName: "scripts") pod "6827937b-ebcc-45a6-98e3-08d49115503b" (UID: "6827937b-ebcc-45a6-98e3-08d49115503b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.646627 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-dns-svc\") pod \"8e8c0292-715e-4d4d-a552-5229adfc3e74\" (UID: \"8e8c0292-715e-4d4d-a552-5229adfc3e74\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.647015 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c639af02-a4c8-40cf-947e-a50353ab2537-horizon-secret-key\") pod \"c639af02-a4c8-40cf-947e-a50353ab2537\" (UID: \"c639af02-a4c8-40cf-947e-a50353ab2537\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.647214 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-horizon-secret-key\") pod \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\" (UID: \"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d\") " Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.647326 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6827937b-ebcc-45a6-98e3-08d49115503b-config-data" (OuterVolumeSpecName: "config-data") pod "6827937b-ebcc-45a6-98e3-08d49115503b" (UID: "6827937b-ebcc-45a6-98e3-08d49115503b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.647558 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-logs" (OuterVolumeSpecName: "logs") pod "dca31fe9-df4d-4734-afcd-b0ebf4a54e4d" (UID: "dca31fe9-df4d-4734-afcd-b0ebf4a54e4d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.647853 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-config-data" (OuterVolumeSpecName: "config-data") pod "dca31fe9-df4d-4734-afcd-b0ebf4a54e4d" (UID: "dca31fe9-df4d-4734-afcd-b0ebf4a54e4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.648916 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6827937b-ebcc-45a6-98e3-08d49115503b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.648936 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.648951 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.648959 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c639af02-a4c8-40cf-947e-a50353ab2537-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.648969 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6827937b-ebcc-45a6-98e3-08d49115503b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.648977 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6827937b-ebcc-45a6-98e3-08d49115503b-logs\") on node \"crc\" 
DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.649058 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c639af02-a4c8-40cf-947e-a50353ab2537-config-data" (OuterVolumeSpecName: "config-data") pod "c639af02-a4c8-40cf-947e-a50353ab2537" (UID: "c639af02-a4c8-40cf-947e-a50353ab2537"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.654275 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-scripts" (OuterVolumeSpecName: "scripts") pod "dca31fe9-df4d-4734-afcd-b0ebf4a54e4d" (UID: "dca31fe9-df4d-4734-afcd-b0ebf4a54e4d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.656380 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c639af02-a4c8-40cf-947e-a50353ab2537-scripts" (OuterVolumeSpecName: "scripts") pod "c639af02-a4c8-40cf-947e-a50353ab2537" (UID: "c639af02-a4c8-40cf-947e-a50353ab2537"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.658661 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c639af02-a4c8-40cf-947e-a50353ab2537-kube-api-access-pdfl7" (OuterVolumeSpecName: "kube-api-access-pdfl7") pod "c639af02-a4c8-40cf-947e-a50353ab2537" (UID: "c639af02-a4c8-40cf-947e-a50353ab2537"). InnerVolumeSpecName "kube-api-access-pdfl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.669232 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6827937b-ebcc-45a6-98e3-08d49115503b-kube-api-access-7vfmw" (OuterVolumeSpecName: "kube-api-access-7vfmw") pod "6827937b-ebcc-45a6-98e3-08d49115503b" (UID: "6827937b-ebcc-45a6-98e3-08d49115503b"). InnerVolumeSpecName "kube-api-access-7vfmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.674516 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c639af02-a4c8-40cf-947e-a50353ab2537-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c639af02-a4c8-40cf-947e-a50353ab2537" (UID: "c639af02-a4c8-40cf-947e-a50353ab2537"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.693301 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6827937b-ebcc-45a6-98e3-08d49115503b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6827937b-ebcc-45a6-98e3-08d49115503b" (UID: "6827937b-ebcc-45a6-98e3-08d49115503b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.693557 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-kube-api-access-xf7h7" (OuterVolumeSpecName: "kube-api-access-xf7h7") pod "dca31fe9-df4d-4734-afcd-b0ebf4a54e4d" (UID: "dca31fe9-df4d-4734-afcd-b0ebf4a54e4d"). InnerVolumeSpecName "kube-api-access-xf7h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.695649 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dca31fe9-df4d-4734-afcd-b0ebf4a54e4d" (UID: "dca31fe9-df4d-4734-afcd-b0ebf4a54e4d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.696231 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e8c0292-715e-4d4d-a552-5229adfc3e74-kube-api-access-2k6b5" (OuterVolumeSpecName: "kube-api-access-2k6b5") pod "8e8c0292-715e-4d4d-a552-5229adfc3e74" (UID: "8e8c0292-715e-4d4d-a552-5229adfc3e74"). InnerVolumeSpecName "kube-api-access-2k6b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.728295 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8e8c0292-715e-4d4d-a552-5229adfc3e74" (UID: "8e8c0292-715e-4d4d-a552-5229adfc3e74"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.728549 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8e8c0292-715e-4d4d-a552-5229adfc3e74" (UID: "8e8c0292-715e-4d4d-a552-5229adfc3e74"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.728774 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8e8c0292-715e-4d4d-a552-5229adfc3e74" (UID: "8e8c0292-715e-4d4d-a552-5229adfc3e74"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.729934 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-config" (OuterVolumeSpecName: "config") pod "8e8c0292-715e-4d4d-a552-5229adfc3e74" (UID: "8e8c0292-715e-4d4d-a552-5229adfc3e74"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.738682 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e8c0292-715e-4d4d-a552-5229adfc3e74" (UID: "8e8c0292-715e-4d4d-a552-5229adfc3e74"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751204 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vfmw\" (UniqueName: \"kubernetes.io/projected/6827937b-ebcc-45a6-98e3-08d49115503b-kube-api-access-7vfmw\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751243 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751257 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751268 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdfl7\" (UniqueName: \"kubernetes.io/projected/c639af02-a4c8-40cf-947e-a50353ab2537-kube-api-access-pdfl7\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751279 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751290 4873 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751300 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c639af02-a4c8-40cf-947e-a50353ab2537-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751311 4873 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf7h7\" (UniqueName: \"kubernetes.io/projected/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-kube-api-access-xf7h7\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751322 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k6b5\" (UniqueName: \"kubernetes.io/projected/8e8c0292-715e-4d4d-a552-5229adfc3e74-kube-api-access-2k6b5\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751333 4873 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6827937b-ebcc-45a6-98e3-08d49115503b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751344 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751355 4873 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c639af02-a4c8-40cf-947e-a50353ab2537-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751367 4873 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751377 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c639af02-a4c8-40cf-947e-a50353ab2537-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.751389 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/8e8c0292-715e-4d4d-a552-5229adfc3e74-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.877801 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-98c8c74bf-wsl5f" event={"ID":"c639af02-a4c8-40cf-947e-a50353ab2537","Type":"ContainerDied","Data":"9eac663eb58aa13e3523f32d4bcb37aa001e4ee953f53b136a077545dfbf1008"} Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.877847 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-98c8c74bf-wsl5f" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.879363 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bfd776d9-fdg7f" event={"ID":"6827937b-ebcc-45a6-98e3-08d49115503b","Type":"ContainerDied","Data":"3316c81d04a3ec6d98aa8cad078c0b0a6499f0b69e6af433f3ff3dc9ecbf7c2c"} Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.879411 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76bfd776d9-fdg7f" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.882566 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.882550 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" event={"ID":"8e8c0292-715e-4d4d-a552-5229adfc3e74","Type":"ContainerDied","Data":"04cdcce41cf06a6a4c22d13c5a42c60370cf2135128656d07548457adad958ae"} Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.884881 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79f476f4fc-dsgbh" event={"ID":"dca31fe9-df4d-4734-afcd-b0ebf4a54e4d","Type":"ContainerDied","Data":"60cadfc4a70dc8e7874b849ac63b6a6b2ae5cbe2b77781abafab78ab09e65314"} Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.884969 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79f476f4fc-dsgbh" Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.926963 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f77dfd79f-tg9w4"] Feb 19 10:02:43 crc kubenswrapper[4873]: I0219 10:02:43.933976 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f77dfd79f-tg9w4"] Feb 19 10:02:44 crc kubenswrapper[4873]: I0219 10:02:44.013081 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76bfd776d9-fdg7f"] Feb 19 10:02:44 crc kubenswrapper[4873]: I0219 10:02:44.033360 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76bfd776d9-fdg7f"] Feb 19 10:02:44 crc kubenswrapper[4873]: I0219 10:02:44.063765 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79f476f4fc-dsgbh"] Feb 19 10:02:44 crc kubenswrapper[4873]: I0219 10:02:44.073857 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-79f476f4fc-dsgbh"] Feb 19 10:02:44 crc kubenswrapper[4873]: I0219 10:02:44.088753 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-98c8c74bf-wsl5f"] Feb 19 10:02:44 crc kubenswrapper[4873]: I0219 10:02:44.095677 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-98c8c74bf-wsl5f"] Feb 19 10:02:44 crc kubenswrapper[4873]: E0219 10:02:44.562005 4873 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Feb 19 10:02:44 crc kubenswrapper[4873]: E0219 10:02:44.562056 4873 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Feb 19 10:02:44 crc kubenswrapper[4873]: E0219 10:02:44.562216 4873 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.20:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k7ttv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-gqrb5_openstack(ce5accb4-1da0-4a21-a289-7dba33ad935f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 10:02:44 crc kubenswrapper[4873]: E0219 10:02:44.563413 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-gqrb5" podUID="ce5accb4-1da0-4a21-a289-7dba33ad935f" Feb 19 10:02:44 crc kubenswrapper[4873]: I0219 10:02:44.588335 4873 scope.go:117] "RemoveContainer" containerID="e625952875a51eb5caf68d1f4611160ae7316ded4cd569ca37c367b0a5f6884b" Feb 19 10:02:44 crc kubenswrapper[4873]: E0219 10:02:44.900593 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.20:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-gqrb5" podUID="ce5accb4-1da0-4a21-a289-7dba33ad935f" Feb 19 10:02:45 crc kubenswrapper[4873]: I0219 10:02:45.017868 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6687d9896d-v96j2"] Feb 19 10:02:45 crc kubenswrapper[4873]: I0219 10:02:45.177353 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-87df9b646-2jf26"] Feb 19 10:02:45 crc kubenswrapper[4873]: W0219 10:02:45.442090 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa527f64_6e38_48c2_9927_a319f4579070.slice/crio-080ebd3682c04f85d87960e452b1d2aad6833e4217c592f0f75127df01aa50cf WatchSource:0}: Error finding container 
080ebd3682c04f85d87960e452b1d2aad6833e4217c592f0f75127df01aa50cf: Status 404 returned error can't find the container with id 080ebd3682c04f85d87960e452b1d2aad6833e4217c592f0f75127df01aa50cf Feb 19 10:02:45 crc kubenswrapper[4873]: W0219 10:02:45.445749 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcace1157_1459_4823_aa8f_b2c246d3adeb.slice/crio-9b51de3389b17ee12d7f59af44e9eef14565d045f0ca62918a72e7a072d8c72e WatchSource:0}: Error finding container 9b51de3389b17ee12d7f59af44e9eef14565d045f0ca62918a72e7a072d8c72e: Status 404 returned error can't find the container with id 9b51de3389b17ee12d7f59af44e9eef14565d045f0ca62918a72e7a072d8c72e Feb 19 10:02:45 crc kubenswrapper[4873]: I0219 10:02:45.530705 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6827937b-ebcc-45a6-98e3-08d49115503b" path="/var/lib/kubelet/pods/6827937b-ebcc-45a6-98e3-08d49115503b/volumes" Feb 19 10:02:45 crc kubenswrapper[4873]: I0219 10:02:45.531479 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e8c0292-715e-4d4d-a552-5229adfc3e74" path="/var/lib/kubelet/pods/8e8c0292-715e-4d4d-a552-5229adfc3e74/volumes" Feb 19 10:02:45 crc kubenswrapper[4873]: I0219 10:02:45.533093 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c639af02-a4c8-40cf-947e-a50353ab2537" path="/var/lib/kubelet/pods/c639af02-a4c8-40cf-947e-a50353ab2537/volumes" Feb 19 10:02:45 crc kubenswrapper[4873]: I0219 10:02:45.534013 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dca31fe9-df4d-4734-afcd-b0ebf4a54e4d" path="/var/lib/kubelet/pods/dca31fe9-df4d-4734-afcd-b0ebf4a54e4d/volumes" Feb 19 10:02:45 crc kubenswrapper[4873]: I0219 10:02:45.580202 4873 scope.go:117] "RemoveContainer" containerID="3202df88b506237a1560baea9fc86854fa472069f50ad6c6f94a7855eaa6ff1a" Feb 19 10:02:45 crc kubenswrapper[4873]: I0219 10:02:45.714046 4873 scope.go:117] 
"RemoveContainer" containerID="c38f23c9308a52dc889562a59a6b3d3134f3aebd40d9ab2a2804a839bf127153" Feb 19 10:02:45 crc kubenswrapper[4873]: I0219 10:02:45.916050 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687d9896d-v96j2" event={"ID":"fa527f64-6e38-48c2-9927-a319f4579070","Type":"ContainerStarted","Data":"080ebd3682c04f85d87960e452b1d2aad6833e4217c592f0f75127df01aa50cf"} Feb 19 10:02:45 crc kubenswrapper[4873]: I0219 10:02:45.918588 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87df9b646-2jf26" event={"ID":"cace1157-1459-4823-aa8f-b2c246d3adeb","Type":"ContainerStarted","Data":"9b51de3389b17ee12d7f59af44e9eef14565d045f0ca62918a72e7a072d8c72e"} Feb 19 10:02:45 crc kubenswrapper[4873]: I0219 10:02:45.921354 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"95402218-fbbb-4453-aba6-d135ba3a26bd","Type":"ContainerStarted","Data":"99300e7b247193a93a473310bd314f670d8dad13e2e93b1e553f7e8d446453ed"} Feb 19 10:02:45 crc kubenswrapper[4873]: I0219 10:02:45.955617 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wrcpc"] Feb 19 10:02:45 crc kubenswrapper[4873]: I0219 10:02:45.960450 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=5.416168261 podStartE2EDuration="35.960431043s" podCreationTimestamp="2026-02-19 10:02:10 +0000 UTC" firstStartedPulling="2026-02-19 10:02:12.800174861 +0000 UTC m=+1042.089606499" lastFinishedPulling="2026-02-19 10:02:43.344437613 +0000 UTC m=+1072.633869281" observedRunningTime="2026-02-19 10:02:45.942672372 +0000 UTC m=+1075.232104010" watchObservedRunningTime="2026-02-19 10:02:45.960431043 +0000 UTC m=+1075.249862671" Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.070995 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5687f4c549-n4g4v"] Feb 19 10:02:46 crc 
kubenswrapper[4873]: I0219 10:02:46.170854 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.259803 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.291784 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.393983 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:02:46 crc kubenswrapper[4873]: W0219 10:02:46.418585 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefd66693_8b61_499e_a8c8_f8545b8fcced.slice/crio-3a663286d199f6315c6fc0d35eabda8f21a34ebbb20580efa338c69320c9e613 WatchSource:0}: Error finding container 3a663286d199f6315c6fc0d35eabda8f21a34ebbb20580efa338c69320c9e613: Status 404 returned error can't find the container with id 3a663286d199f6315c6fc0d35eabda8f21a34ebbb20580efa338c69320c9e613 Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.935499 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"b8008736-31ec-491c-aa52-03b9413feab9","Type":"ContainerStarted","Data":"a99eef84ecab86eece9faca56461f62ff2124a5b8881a9de7669714effc8b149"} Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.939423 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6e3a8bf8-a885-4a50-97d2-53df598b1ce9","Type":"ContainerStarted","Data":"ab78ff8f102582e8303092caf70a3741f3f6463262da1956c73f2a02fe74dcaa"} Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.943305 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4pv5z" 
event={"ID":"943d069e-6ad4-4411-b937-c4499f0ced6f","Type":"ContainerStarted","Data":"e48e8c3f3cd0f5266c11f4e70aa13217161d56e20c834665683a06bd7308e111"} Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.946580 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab448dfd-a67c-49b5-a153-92a5a6f504b2","Type":"ContainerStarted","Data":"ad6c5766518ab006754a3a4a7d650831c78a948d573045b4e146b11a48e7017e"} Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.964040 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=17.92166552 podStartE2EDuration="36.964020801s" podCreationTimestamp="2026-02-19 10:02:10 +0000 UTC" firstStartedPulling="2026-02-19 10:02:12.739310721 +0000 UTC m=+1042.028742369" lastFinishedPulling="2026-02-19 10:02:31.781666012 +0000 UTC m=+1061.071097650" observedRunningTime="2026-02-19 10:02:46.960310029 +0000 UTC m=+1076.249741667" watchObservedRunningTime="2026-02-19 10:02:46.964020801 +0000 UTC m=+1076.253452449" Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.983712 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687d9896d-v96j2" event={"ID":"fa527f64-6e38-48c2-9927-a319f4579070","Type":"ContainerStarted","Data":"41ce1383b1efd52fcaaa5f36442aae9782ce54733f0cce74dcd958919193025c"} Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.983945 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687d9896d-v96j2" event={"ID":"fa527f64-6e38-48c2-9927-a319f4579070","Type":"ContainerStarted","Data":"e4d390edab0549e2140b5a29610a7484352c87ec56208011ef852543b6dab746"} Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.989819 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2","Type":"ContainerStarted","Data":"63ad458896e287b047bf524f55c92f7c8b727110e2a9fed53b0acb11b2c99b24"} 
Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.989860 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2","Type":"ContainerStarted","Data":"cd1c8470e0eda76a81d6e96f1fc492dbb2365558bb9cd6d4a1af5b28681c31e0"} Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.989870 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2","Type":"ContainerStarted","Data":"5c219368094a8a1c527a293923d373b84ac19d1c24f32f1ef5514bf9fa1b2437"} Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.990989 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.993412 4873 generic.go:334] "Generic (PLEG): container finished" podID="64deb684-42f6-4bb5-b774-ef57839a56d5" containerID="2156fdadae7d71bb536233ced37bfe76646867be4fb2b42c0784cff65fb2da11" exitCode=0 Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.993463 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" event={"ID":"64deb684-42f6-4bb5-b774-ef57839a56d5","Type":"ContainerDied","Data":"2156fdadae7d71bb536233ced37bfe76646867be4fb2b42c0784cff65fb2da11"} Feb 19 10:02:46 crc kubenswrapper[4873]: I0219 10:02:46.993482 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" event={"ID":"64deb684-42f6-4bb5-b774-ef57839a56d5","Type":"ContainerStarted","Data":"de0152a57feb0720c3ff97d1d52995f66e5e8c9b3cc0aff67e6dfa65b92a668a"} Feb 19 10:02:47 crc kubenswrapper[4873]: I0219 10:02:47.007449 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-4pv5z" podStartSLOduration=5.487297785 podStartE2EDuration="37.007432358s" podCreationTimestamp="2026-02-19 10:02:10 +0000 UTC" firstStartedPulling="2026-02-19 10:02:12.987773295 +0000 
UTC m=+1042.277204933" lastFinishedPulling="2026-02-19 10:02:44.507907848 +0000 UTC m=+1073.797339506" observedRunningTime="2026-02-19 10:02:46.983480664 +0000 UTC m=+1076.272912302" watchObservedRunningTime="2026-02-19 10:02:47.007432358 +0000 UTC m=+1076.296863996" Feb 19 10:02:47 crc kubenswrapper[4873]: I0219 10:02:47.024395 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"efd66693-8b61-499e-a8c8-f8545b8fcced","Type":"ContainerStarted","Data":"3a663286d199f6315c6fc0d35eabda8f21a34ebbb20580efa338c69320c9e613"} Feb 19 10:02:47 crc kubenswrapper[4873]: I0219 10:02:47.055008 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6687d9896d-v96j2" podStartSLOduration=26.681275187 podStartE2EDuration="27.054988648s" podCreationTimestamp="2026-02-19 10:02:20 +0000 UTC" firstStartedPulling="2026-02-19 10:02:45.476476567 +0000 UTC m=+1074.765908215" lastFinishedPulling="2026-02-19 10:02:45.850190028 +0000 UTC m=+1075.139621676" observedRunningTime="2026-02-19 10:02:47.007397037 +0000 UTC m=+1076.296828675" watchObservedRunningTime="2026-02-19 10:02:47.054988648 +0000 UTC m=+1076.344420276" Feb 19 10:02:47 crc kubenswrapper[4873]: I0219 10:02:47.063056 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87df9b646-2jf26" event={"ID":"cace1157-1459-4823-aa8f-b2c246d3adeb","Type":"ContainerStarted","Data":"f728a5cace0f3c84844ee9bd7c5a0c48b5b5cad808dd5c682427cb942eb77db6"} Feb 19 10:02:47 crc kubenswrapper[4873]: I0219 10:02:47.063094 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87df9b646-2jf26" event={"ID":"cace1157-1459-4823-aa8f-b2c246d3adeb","Type":"ContainerStarted","Data":"1a1b6f4ba694daddb17f029a0bbce06c79e8294e69f096dade9d91ac98c03f81"} Feb 19 10:02:47 crc kubenswrapper[4873]: I0219 10:02:47.075373 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-98gbw" 
event={"ID":"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6","Type":"ContainerStarted","Data":"ad115d69dacdb41d674a33e1db809c1ccc6821733d5a2f7e41e2ae5cb63809b4"} Feb 19 10:02:47 crc kubenswrapper[4873]: I0219 10:02:47.078879 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=15.07884934 podStartE2EDuration="15.07884934s" podCreationTimestamp="2026-02-19 10:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:47.026777928 +0000 UTC m=+1076.316209566" watchObservedRunningTime="2026-02-19 10:02:47.07884934 +0000 UTC m=+1076.368280978" Feb 19 10:02:47 crc kubenswrapper[4873]: I0219 10:02:47.097248 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wrcpc" event={"ID":"58099bc8-1a29-467b-b13d-c0713e42e6c2","Type":"ContainerStarted","Data":"a196b181363d1056b517a86bc31a20f9a28399d782296cb45561ba646a621a77"} Feb 19 10:02:47 crc kubenswrapper[4873]: I0219 10:02:47.097299 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wrcpc" event={"ID":"58099bc8-1a29-467b-b13d-c0713e42e6c2","Type":"ContainerStarted","Data":"ed07e84b25a983ea418f7869298191a80a5dc7f605f24c8aefa1a6f1b3d88cd5"} Feb 19 10:02:47 crc kubenswrapper[4873]: I0219 10:02:47.119473 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-87df9b646-2jf26" podStartSLOduration=26.769485796 podStartE2EDuration="27.119452987s" podCreationTimestamp="2026-02-19 10:02:20 +0000 UTC" firstStartedPulling="2026-02-19 10:02:45.476455557 +0000 UTC m=+1074.765887215" lastFinishedPulling="2026-02-19 10:02:45.826422748 +0000 UTC m=+1075.115854406" observedRunningTime="2026-02-19 10:02:47.080913461 +0000 UTC m=+1076.370345099" watchObservedRunningTime="2026-02-19 10:02:47.119452987 +0000 UTC m=+1076.408884625" Feb 19 10:02:47 crc kubenswrapper[4873]: I0219 
10:02:47.127896 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-98gbw" podStartSLOduration=3.722887562 podStartE2EDuration="36.127877836s" podCreationTimestamp="2026-02-19 10:02:11 +0000 UTC" firstStartedPulling="2026-02-19 10:02:13.225126884 +0000 UTC m=+1042.514558522" lastFinishedPulling="2026-02-19 10:02:45.630117148 +0000 UTC m=+1074.919548796" observedRunningTime="2026-02-19 10:02:47.099682517 +0000 UTC m=+1076.389114155" watchObservedRunningTime="2026-02-19 10:02:47.127877836 +0000 UTC m=+1076.417309474" Feb 19 10:02:47 crc kubenswrapper[4873]: I0219 10:02:47.136768 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wrcpc" podStartSLOduration=14.136753087 podStartE2EDuration="14.136753087s" podCreationTimestamp="2026-02-19 10:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:47.125175139 +0000 UTC m=+1076.414606777" watchObservedRunningTime="2026-02-19 10:02:47.136753087 +0000 UTC m=+1076.426184725" Feb 19 10:02:47 crc kubenswrapper[4873]: I0219 10:02:47.533216 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f77dfd79f-tg9w4" podUID="8e8c0292-715e-4d4d-a552-5229adfc3e74" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: i/o timeout" Feb 19 10:02:48 crc kubenswrapper[4873]: I0219 10:02:48.139907 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"efd66693-8b61-499e-a8c8-f8545b8fcced","Type":"ContainerStarted","Data":"d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784"} Feb 19 10:02:48 crc kubenswrapper[4873]: I0219 10:02:48.141911 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"6e3a8bf8-a885-4a50-97d2-53df598b1ce9","Type":"ContainerStarted","Data":"da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3"} Feb 19 10:02:48 crc kubenswrapper[4873]: I0219 10:02:48.150979 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" event={"ID":"64deb684-42f6-4bb5-b774-ef57839a56d5","Type":"ContainerStarted","Data":"b6d3e647789f4d02e72fca723b661c289177f805a67c5edd75747ee3947add92"} Feb 19 10:02:48 crc kubenswrapper[4873]: I0219 10:02:48.175439 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" podStartSLOduration=15.175412285 podStartE2EDuration="15.175412285s" podCreationTimestamp="2026-02-19 10:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:48.168492683 +0000 UTC m=+1077.457924321" watchObservedRunningTime="2026-02-19 10:02:48.175412285 +0000 UTC m=+1077.464843923" Feb 19 10:02:48 crc kubenswrapper[4873]: I0219 10:02:48.192360 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 10:02:48 crc kubenswrapper[4873]: I0219 10:02:48.789469 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.163151 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"efd66693-8b61-499e-a8c8-f8545b8fcced","Type":"ContainerStarted","Data":"f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269"} Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.163224 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="efd66693-8b61-499e-a8c8-f8545b8fcced" containerName="glance-log" 
containerID="cri-o://d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784" gracePeriod=30 Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.163306 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="efd66693-8b61-499e-a8c8-f8545b8fcced" containerName="glance-httpd" containerID="cri-o://f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269" gracePeriod=30 Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.165706 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6e3a8bf8-a885-4a50-97d2-53df598b1ce9","Type":"ContainerStarted","Data":"d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9"} Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.165964 4873 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.166438 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6e3a8bf8-a885-4a50-97d2-53df598b1ce9" containerName="glance-log" containerID="cri-o://da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3" gracePeriod=30 Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.166600 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6e3a8bf8-a885-4a50-97d2-53df598b1ce9" containerName="glance-httpd" containerID="cri-o://d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9" gracePeriod=30 Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.190879 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=16.190847706 podStartE2EDuration="16.190847706s" podCreationTimestamp="2026-02-19 10:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:49.183402541 +0000 UTC m=+1078.472834179" watchObservedRunningTime="2026-02-19 10:02:49.190847706 +0000 UTC m=+1078.480279344" Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.213469 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=16.213449827 podStartE2EDuration="16.213449827s" podCreationTimestamp="2026-02-19 10:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:49.202937586 +0000 UTC m=+1078.492369214" watchObservedRunningTime="2026-02-19 10:02:49.213449827 +0000 UTC m=+1078.502881465" Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.872645 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.954461 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.992438 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-scripts\") pod \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.992505 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-288js\" (UniqueName: \"kubernetes.io/projected/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-kube-api-access-288js\") pod \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.992548 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-config-data\") pod \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.992608 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-httpd-run\") pod \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.992640 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.992713 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-combined-ca-bundle\") pod \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.992732 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-logs\") pod \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\" (UID: \"6e3a8bf8-a885-4a50-97d2-53df598b1ce9\") " Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.996238 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-logs" (OuterVolumeSpecName: "logs") pod "6e3a8bf8-a885-4a50-97d2-53df598b1ce9" (UID: "6e3a8bf8-a885-4a50-97d2-53df598b1ce9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:02:49 crc kubenswrapper[4873]: I0219 10:02:49.997450 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6e3a8bf8-a885-4a50-97d2-53df598b1ce9" (UID: "6e3a8bf8-a885-4a50-97d2-53df598b1ce9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.002476 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-scripts" (OuterVolumeSpecName: "scripts") pod "6e3a8bf8-a885-4a50-97d2-53df598b1ce9" (UID: "6e3a8bf8-a885-4a50-97d2-53df598b1ce9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.002492 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "6e3a8bf8-a885-4a50-97d2-53df598b1ce9" (UID: "6e3a8bf8-a885-4a50-97d2-53df598b1ce9"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.007264 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-kube-api-access-288js" (OuterVolumeSpecName: "kube-api-access-288js") pod "6e3a8bf8-a885-4a50-97d2-53df598b1ce9" (UID: "6e3a8bf8-a885-4a50-97d2-53df598b1ce9"). InnerVolumeSpecName "kube-api-access-288js". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.033222 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e3a8bf8-a885-4a50-97d2-53df598b1ce9" (UID: "6e3a8bf8-a885-4a50-97d2-53df598b1ce9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.066703 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-config-data" (OuterVolumeSpecName: "config-data") pod "6e3a8bf8-a885-4a50-97d2-53df598b1ce9" (UID: "6e3a8bf8-a885-4a50-97d2-53df598b1ce9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.096780 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd66693-8b61-499e-a8c8-f8545b8fcced-logs\") pod \"efd66693-8b61-499e-a8c8-f8545b8fcced\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.096831 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efd66693-8b61-499e-a8c8-f8545b8fcced-httpd-run\") pod \"efd66693-8b61-499e-a8c8-f8545b8fcced\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.096859 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"efd66693-8b61-499e-a8c8-f8545b8fcced\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.096935 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxh4r\" (UniqueName: \"kubernetes.io/projected/efd66693-8b61-499e-a8c8-f8545b8fcced-kube-api-access-wxh4r\") pod \"efd66693-8b61-499e-a8c8-f8545b8fcced\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.097026 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-scripts\") pod \"efd66693-8b61-499e-a8c8-f8545b8fcced\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.097055 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-config-data\") pod \"efd66693-8b61-499e-a8c8-f8545b8fcced\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.097129 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-combined-ca-bundle\") pod \"efd66693-8b61-499e-a8c8-f8545b8fcced\" (UID: \"efd66693-8b61-499e-a8c8-f8545b8fcced\") " Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.097441 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.097452 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.097460 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.097468 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-288js\" (UniqueName: \"kubernetes.io/projected/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-kube-api-access-288js\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.097478 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.097486 4873 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6e3a8bf8-a885-4a50-97d2-53df598b1ce9-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.097504 4873 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.101359 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efd66693-8b61-499e-a8c8-f8545b8fcced-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "efd66693-8b61-499e-a8c8-f8545b8fcced" (UID: "efd66693-8b61-499e-a8c8-f8545b8fcced"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.101598 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efd66693-8b61-499e-a8c8-f8545b8fcced-kube-api-access-wxh4r" (OuterVolumeSpecName: "kube-api-access-wxh4r") pod "efd66693-8b61-499e-a8c8-f8545b8fcced" (UID: "efd66693-8b61-499e-a8c8-f8545b8fcced"). InnerVolumeSpecName "kube-api-access-wxh4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.101611 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efd66693-8b61-499e-a8c8-f8545b8fcced-logs" (OuterVolumeSpecName: "logs") pod "efd66693-8b61-499e-a8c8-f8545b8fcced" (UID: "efd66693-8b61-499e-a8c8-f8545b8fcced"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.104333 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-scripts" (OuterVolumeSpecName: "scripts") pod "efd66693-8b61-499e-a8c8-f8545b8fcced" (UID: "efd66693-8b61-499e-a8c8-f8545b8fcced"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.107253 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "efd66693-8b61-499e-a8c8-f8545b8fcced" (UID: "efd66693-8b61-499e-a8c8-f8545b8fcced"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.117791 4873 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.136283 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efd66693-8b61-499e-a8c8-f8545b8fcced" (UID: "efd66693-8b61-499e-a8c8-f8545b8fcced"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.159846 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-config-data" (OuterVolumeSpecName: "config-data") pod "efd66693-8b61-499e-a8c8-f8545b8fcced" (UID: "efd66693-8b61-499e-a8c8-f8545b8fcced"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.183942 4873 generic.go:334] "Generic (PLEG): container finished" podID="6e3a8bf8-a885-4a50-97d2-53df598b1ce9" containerID="d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9" exitCode=0 Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.184971 4873 generic.go:334] "Generic (PLEG): container finished" podID="6e3a8bf8-a885-4a50-97d2-53df598b1ce9" containerID="da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3" exitCode=143 Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.184074 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.184027 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6e3a8bf8-a885-4a50-97d2-53df598b1ce9","Type":"ContainerDied","Data":"d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9"} Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.185359 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6e3a8bf8-a885-4a50-97d2-53df598b1ce9","Type":"ContainerDied","Data":"da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3"} Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.185381 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6e3a8bf8-a885-4a50-97d2-53df598b1ce9","Type":"ContainerDied","Data":"ab78ff8f102582e8303092caf70a3741f3f6463262da1956c73f2a02fe74dcaa"} Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.185401 4873 scope.go:117] "RemoveContainer" containerID="d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.190352 4873 generic.go:334] "Generic (PLEG): container finished" 
podID="efd66693-8b61-499e-a8c8-f8545b8fcced" containerID="f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269" exitCode=0 Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.190381 4873 generic.go:334] "Generic (PLEG): container finished" podID="efd66693-8b61-499e-a8c8-f8545b8fcced" containerID="d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784" exitCode=143 Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.191306 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.192462 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"efd66693-8b61-499e-a8c8-f8545b8fcced","Type":"ContainerDied","Data":"f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269"} Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.192538 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"efd66693-8b61-499e-a8c8-f8545b8fcced","Type":"ContainerDied","Data":"d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784"} Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.192551 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"efd66693-8b61-499e-a8c8-f8545b8fcced","Type":"ContainerDied","Data":"3a663286d199f6315c6fc0d35eabda8f21a34ebbb20580efa338c69320c9e613"} Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.192650 4873 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.198718 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.198749 4873 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.198759 4873 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.198768 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd66693-8b61-499e-a8c8-f8545b8fcced-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.198777 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd66693-8b61-499e-a8c8-f8545b8fcced-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.198785 4873 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efd66693-8b61-499e-a8c8-f8545b8fcced-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.198812 4873 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.198821 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxh4r\" (UniqueName: \"kubernetes.io/projected/efd66693-8b61-499e-a8c8-f8545b8fcced-kube-api-access-wxh4r\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.217717 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.224065 4873 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.237198 4873 scope.go:117] "RemoveContainer" containerID="da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.253055 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.261005 4873 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.273002 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.276685 4873 scope.go:117] "RemoveContainer" containerID="d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9" Feb 19 10:02:50 crc kubenswrapper[4873]: E0219 10:02:50.292204 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9\": container with ID starting with d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9 not found: ID does not exist" containerID="d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.292444 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9"} err="failed to get container status \"d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9\": rpc error: code = NotFound desc = could not find container \"d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9\": container with ID starting with 
d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9 not found: ID does not exist" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.292522 4873 scope.go:117] "RemoveContainer" containerID="da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3" Feb 19 10:02:50 crc kubenswrapper[4873]: E0219 10:02:50.302365 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3\": container with ID starting with da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3 not found: ID does not exist" containerID="da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.302409 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3"} err="failed to get container status \"da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3\": rpc error: code = NotFound desc = could not find container \"da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3\": container with ID starting with da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3 not found: ID does not exist" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.302440 4873 scope.go:117] "RemoveContainer" containerID="d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.302850 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9"} err="failed to get container status \"d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9\": rpc error: code = NotFound desc = could not find container \"d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9\": container with ID 
starting with d8c9d310f509d718822d8a656e17a11949d51434323480d7753635e919f55fb9 not found: ID does not exist" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.302897 4873 scope.go:117] "RemoveContainer" containerID="da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.303593 4873 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.310853 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3"} err="failed to get container status \"da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3\": rpc error: code = NotFound desc = could not find container \"da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3\": container with ID starting with da8e8797e08ad3efb1311ccadaefae92c7b004bdf2afd3678b7f82b472bcc5e3 not found: ID does not exist" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.310906 4873 scope.go:117] "RemoveContainer" containerID="f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.325036 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 10:02:50 crc kubenswrapper[4873]: E0219 10:02:50.326161 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3a8bf8-a885-4a50-97d2-53df598b1ce9" containerName="glance-httpd" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.326242 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3a8bf8-a885-4a50-97d2-53df598b1ce9" containerName="glance-httpd" Feb 19 10:02:50 crc kubenswrapper[4873]: E0219 10:02:50.326301 4873 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8e8c0292-715e-4d4d-a552-5229adfc3e74" containerName="dnsmasq-dns" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.326347 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8c0292-715e-4d4d-a552-5229adfc3e74" containerName="dnsmasq-dns" Feb 19 10:02:50 crc kubenswrapper[4873]: E0219 10:02:50.326413 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd66693-8b61-499e-a8c8-f8545b8fcced" containerName="glance-log" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.326461 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd66693-8b61-499e-a8c8-f8545b8fcced" containerName="glance-log" Feb 19 10:02:50 crc kubenswrapper[4873]: E0219 10:02:50.326518 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd66693-8b61-499e-a8c8-f8545b8fcced" containerName="glance-httpd" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.326565 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd66693-8b61-499e-a8c8-f8545b8fcced" containerName="glance-httpd" Feb 19 10:02:50 crc kubenswrapper[4873]: E0219 10:02:50.326624 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8c0292-715e-4d4d-a552-5229adfc3e74" containerName="init" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.326672 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8c0292-715e-4d4d-a552-5229adfc3e74" containerName="init" Feb 19 10:02:50 crc kubenswrapper[4873]: E0219 10:02:50.326732 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3a8bf8-a885-4a50-97d2-53df598b1ce9" containerName="glance-log" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.326859 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3a8bf8-a885-4a50-97d2-53df598b1ce9" containerName="glance-log" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.327326 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3a8bf8-a885-4a50-97d2-53df598b1ce9" 
containerName="glance-httpd" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.353319 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3a8bf8-a885-4a50-97d2-53df598b1ce9" containerName="glance-log" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.353366 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd66693-8b61-499e-a8c8-f8545b8fcced" containerName="glance-log" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.353393 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e8c0292-715e-4d4d-a552-5229adfc3e74" containerName="dnsmasq-dns" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.353414 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd66693-8b61-499e-a8c8-f8545b8fcced" containerName="glance-httpd" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.355227 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.355290 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.355376 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.362175 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.370236 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.370801 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.370869 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-n9qxt" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.371283 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.373197 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.373582 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.374294 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.381431 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.507260 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.507616 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.507671 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.507700 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.507752 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-logs\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.507779 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.507817 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.507846 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.507917 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.507959 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.507990 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.508013 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggcss\" (UniqueName: \"kubernetes.io/projected/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-kube-api-access-ggcss\") pod 
\"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.508036 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.508074 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xhjk\" (UniqueName: \"kubernetes.io/projected/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-kube-api-access-9xhjk\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.508148 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-config-data\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.508179 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.610687 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.610776 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.610800 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.610817 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggcss\" (UniqueName: \"kubernetes.io/projected/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-kube-api-access-ggcss\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.611634 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.611665 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xhjk\" (UniqueName: 
\"kubernetes.io/projected/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-kube-api-access-9xhjk\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.611704 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-config-data\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.611728 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.611781 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.611799 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.611841 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.611862 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.611860 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.611898 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-logs\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.611922 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.611949 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.611973 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.611856 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.612758 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.613459 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.616552 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " 
pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.618569 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-logs\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.620744 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.623367 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.632760 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.635071 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 
10:02:50.637065 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.638163 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xhjk\" (UniqueName: \"kubernetes.io/projected/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-kube-api-access-9xhjk\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.640493 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-scripts\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.640531 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggcss\" (UniqueName: \"kubernetes.io/projected/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-kube-api-access-ggcss\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.642682 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.648636 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-config-data\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.679247 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.701664 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.733364 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.744744 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.842045 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.842167 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.948685 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:50 crc kubenswrapper[4873]: I0219 10:02:50.948756 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:02:51 crc kubenswrapper[4873]: I0219 10:02:51.042754 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 10:02:51 crc kubenswrapper[4873]: I0219 10:02:51.082679 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Feb 19 10:02:51 crc kubenswrapper[4873]: I0219 10:02:51.205587 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 19 10:02:51 crc kubenswrapper[4873]: I0219 10:02:51.239758 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Feb 19 10:02:51 crc kubenswrapper[4873]: I0219 10:02:51.291964 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Feb 19 10:02:51 crc kubenswrapper[4873]: I0219 10:02:51.292042 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:02:51 crc kubenswrapper[4873]: I0219 10:02:51.328041 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Feb 19 
10:02:51 crc kubenswrapper[4873]: I0219 10:02:51.511382 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e3a8bf8-a885-4a50-97d2-53df598b1ce9" path="/var/lib/kubelet/pods/6e3a8bf8-a885-4a50-97d2-53df598b1ce9/volumes" Feb 19 10:02:51 crc kubenswrapper[4873]: I0219 10:02:51.512808 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efd66693-8b61-499e-a8c8-f8545b8fcced" path="/var/lib/kubelet/pods/efd66693-8b61-499e-a8c8-f8545b8fcced/volumes" Feb 19 10:02:52 crc kubenswrapper[4873]: I0219 10:02:52.239084 4873 generic.go:334] "Generic (PLEG): container finished" podID="58099bc8-1a29-467b-b13d-c0713e42e6c2" containerID="a196b181363d1056b517a86bc31a20f9a28399d782296cb45561ba646a621a77" exitCode=0 Feb 19 10:02:52 crc kubenswrapper[4873]: I0219 10:02:52.239198 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wrcpc" event={"ID":"58099bc8-1a29-467b-b13d-c0713e42e6c2","Type":"ContainerDied","Data":"a196b181363d1056b517a86bc31a20f9a28399d782296cb45561ba646a621a77"} Feb 19 10:02:52 crc kubenswrapper[4873]: I0219 10:02:52.283168 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Feb 19 10:02:52 crc kubenswrapper[4873]: I0219 10:02:52.342468 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 10:02:53 crc kubenswrapper[4873]: I0219 10:02:53.191856 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Feb 19 10:02:53 crc kubenswrapper[4873]: I0219 10:02:53.197028 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Feb 19 10:02:53 crc kubenswrapper[4873]: I0219 10:02:53.249915 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="b8008736-31ec-491c-aa52-03b9413feab9" containerName="watcher-decision-engine" 
containerID="cri-o://a99eef84ecab86eece9faca56461f62ff2124a5b8881a9de7669714effc8b149" gracePeriod=30 Feb 19 10:02:53 crc kubenswrapper[4873]: I0219 10:02:53.256639 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 19 10:02:53 crc kubenswrapper[4873]: I0219 10:02:53.790266 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:02:53 crc kubenswrapper[4873]: I0219 10:02:53.851971 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-99d6b5b4f-2j7fk"] Feb 19 10:02:53 crc kubenswrapper[4873]: I0219 10:02:53.857866 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk" podUID="402372ed-3c0d-4d12-a4f5-bbd82024a08d" containerName="dnsmasq-dns" containerID="cri-o://7a12463c2cf197b1f920440df50985d94ae3e7a22c56ad882d01bc741d80d703" gracePeriod=10 Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.265473 4873 scope.go:117] "RemoveContainer" containerID="d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.272012 4873 generic.go:334] "Generic (PLEG): container finished" podID="402372ed-3c0d-4d12-a4f5-bbd82024a08d" containerID="7a12463c2cf197b1f920440df50985d94ae3e7a22c56ad882d01bc741d80d703" exitCode=0 Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.272195 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="95402218-fbbb-4453-aba6-d135ba3a26bd" containerName="watcher-applier" containerID="cri-o://99300e7b247193a93a473310bd314f670d8dad13e2e93b1e553f7e8d446453ed" gracePeriod=30 Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.272465 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk" 
event={"ID":"402372ed-3c0d-4d12-a4f5-bbd82024a08d","Type":"ContainerDied","Data":"7a12463c2cf197b1f920440df50985d94ae3e7a22c56ad882d01bc741d80d703"} Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.360949 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.432330 4873 scope.go:117] "RemoveContainer" containerID="f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269" Feb 19 10:02:54 crc kubenswrapper[4873]: E0219 10:02:54.439647 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269\": container with ID starting with f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269 not found: ID does not exist" containerID="f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.439698 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269"} err="failed to get container status \"f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269\": rpc error: code = NotFound desc = could not find container \"f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269\": container with ID starting with f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269 not found: ID does not exist" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.439719 4873 scope.go:117] "RemoveContainer" containerID="d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784" Feb 19 10:02:54 crc kubenswrapper[4873]: E0219 10:02:54.446055 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784\": container with ID starting with d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784 not found: ID does not exist" containerID="d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.446341 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784"} err="failed to get container status \"d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784\": rpc error: code = NotFound desc = could not find container \"d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784\": container with ID starting with d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784 not found: ID does not exist" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.446361 4873 scope.go:117] "RemoveContainer" containerID="f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.450307 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269"} err="failed to get container status \"f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269\": rpc error: code = NotFound desc = could not find container \"f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269\": container with ID starting with f6480951dbeac0ab665a2aa5fd54473ab3ba4b503cfac0f509921fa0368f3269 not found: ID does not exist" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.450340 4873 scope.go:117] "RemoveContainer" containerID="d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.451011 4873 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784"} err="failed to get container status \"d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784\": rpc error: code = NotFound desc = could not find container \"d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784\": container with ID starting with d2e4e37b9a2920a7efd73dedf273c3a7b41869d40b515eddd13310356cb89784 not found: ID does not exist" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.498683 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-credential-keys\") pod \"58099bc8-1a29-467b-b13d-c0713e42e6c2\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.498726 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-config-data\") pod \"58099bc8-1a29-467b-b13d-c0713e42e6c2\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.498831 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jctkh\" (UniqueName: \"kubernetes.io/projected/58099bc8-1a29-467b-b13d-c0713e42e6c2-kube-api-access-jctkh\") pod \"58099bc8-1a29-467b-b13d-c0713e42e6c2\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.498867 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-scripts\") pod \"58099bc8-1a29-467b-b13d-c0713e42e6c2\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.498901 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-fernet-keys\") pod \"58099bc8-1a29-467b-b13d-c0713e42e6c2\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.499004 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-combined-ca-bundle\") pod \"58099bc8-1a29-467b-b13d-c0713e42e6c2\" (UID: \"58099bc8-1a29-467b-b13d-c0713e42e6c2\") " Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.506858 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "58099bc8-1a29-467b-b13d-c0713e42e6c2" (UID: "58099bc8-1a29-467b-b13d-c0713e42e6c2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.507270 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-scripts" (OuterVolumeSpecName: "scripts") pod "58099bc8-1a29-467b-b13d-c0713e42e6c2" (UID: "58099bc8-1a29-467b-b13d-c0713e42e6c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.517824 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58099bc8-1a29-467b-b13d-c0713e42e6c2-kube-api-access-jctkh" (OuterVolumeSpecName: "kube-api-access-jctkh") pod "58099bc8-1a29-467b-b13d-c0713e42e6c2" (UID: "58099bc8-1a29-467b-b13d-c0713e42e6c2"). InnerVolumeSpecName "kube-api-access-jctkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.546984 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "58099bc8-1a29-467b-b13d-c0713e42e6c2" (UID: "58099bc8-1a29-467b-b13d-c0713e42e6c2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.552044 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58099bc8-1a29-467b-b13d-c0713e42e6c2" (UID: "58099bc8-1a29-467b-b13d-c0713e42e6c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.592280 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-config-data" (OuterVolumeSpecName: "config-data") pod "58099bc8-1a29-467b-b13d-c0713e42e6c2" (UID: "58099bc8-1a29-467b-b13d-c0713e42e6c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.601366 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.601396 4873 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.601407 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.601415 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jctkh\" (UniqueName: \"kubernetes.io/projected/58099bc8-1a29-467b-b13d-c0713e42e6c2-kube-api-access-jctkh\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.601427 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.601437 4873 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58099bc8-1a29-467b-b13d-c0713e42e6c2-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.739053 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.905887 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-ovsdbserver-sb\") pod \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.905969 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-ovsdbserver-nb\") pod \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.905998 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vchwr\" (UniqueName: \"kubernetes.io/projected/402372ed-3c0d-4d12-a4f5-bbd82024a08d-kube-api-access-vchwr\") pod \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.906067 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-dns-svc\") pod \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.906123 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-dns-swift-storage-0\") pod \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.906176 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-config\") pod \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\" (UID: \"402372ed-3c0d-4d12-a4f5-bbd82024a08d\") " Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.910129 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402372ed-3c0d-4d12-a4f5-bbd82024a08d-kube-api-access-vchwr" (OuterVolumeSpecName: "kube-api-access-vchwr") pod "402372ed-3c0d-4d12-a4f5-bbd82024a08d" (UID: "402372ed-3c0d-4d12-a4f5-bbd82024a08d"). InnerVolumeSpecName "kube-api-access-vchwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.951230 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "402372ed-3c0d-4d12-a4f5-bbd82024a08d" (UID: "402372ed-3c0d-4d12-a4f5-bbd82024a08d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.960195 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "402372ed-3c0d-4d12-a4f5-bbd82024a08d" (UID: "402372ed-3c0d-4d12-a4f5-bbd82024a08d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.962698 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "402372ed-3c0d-4d12-a4f5-bbd82024a08d" (UID: "402372ed-3c0d-4d12-a4f5-bbd82024a08d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.967337 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-config" (OuterVolumeSpecName: "config") pod "402372ed-3c0d-4d12-a4f5-bbd82024a08d" (UID: "402372ed-3c0d-4d12-a4f5-bbd82024a08d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.981390 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "402372ed-3c0d-4d12-a4f5-bbd82024a08d" (UID: "402372ed-3c0d-4d12-a4f5-bbd82024a08d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:02:54 crc kubenswrapper[4873]: I0219 10:02:54.989445 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:02:54 crc kubenswrapper[4873]: W0219 10:02:54.993307 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3b30a9_f42d_4ac8_a0d0_9c03d0071c7a.slice/crio-27dd31c2ce043db502b2393bca38366353aaef589f4415a46def82efc00bdbd7 WatchSource:0}: Error finding container 27dd31c2ce043db502b2393bca38366353aaef589f4415a46def82efc00bdbd7: Status 404 returned error can't find the container with id 27dd31c2ce043db502b2393bca38366353aaef589f4415a46def82efc00bdbd7 Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.008676 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.008713 4873 reconciler_common.go:293] "Volume detached for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.008726 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.008736 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.008746 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/402372ed-3c0d-4d12-a4f5-bbd82024a08d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.008754 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vchwr\" (UniqueName: \"kubernetes.io/projected/402372ed-3c0d-4d12-a4f5-bbd82024a08d-kube-api-access-vchwr\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.088787 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 10:02:55 crc kubenswrapper[4873]: W0219 10:02:55.095249 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1da8b72b_fdc0_4c00_a1da_cdb5e8e04e8e.slice/crio-d67a114aa214902359c1e29c718493f6dd023a96ca4f5d6261a47eb5f1d136c6 WatchSource:0}: Error finding container d67a114aa214902359c1e29c718493f6dd023a96ca4f5d6261a47eb5f1d136c6: Status 404 returned error can't find the container with id d67a114aa214902359c1e29c718493f6dd023a96ca4f5d6261a47eb5f1d136c6 Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.308623 4873 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a","Type":"ContainerStarted","Data":"27dd31c2ce043db502b2393bca38366353aaef589f4415a46def82efc00bdbd7"} Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.323718 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wrcpc" event={"ID":"58099bc8-1a29-467b-b13d-c0713e42e6c2","Type":"ContainerDied","Data":"ed07e84b25a983ea418f7869298191a80a5dc7f605f24c8aefa1a6f1b3d88cd5"} Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.323756 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed07e84b25a983ea418f7869298191a80a5dc7f605f24c8aefa1a6f1b3d88cd5" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.323777 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wrcpc" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.350412 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk" event={"ID":"402372ed-3c0d-4d12-a4f5-bbd82024a08d","Type":"ContainerDied","Data":"676e73049674f4406ff08bc00e5c61a6ce15ece9c685ed8a76d9fda336863789"} Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.350484 4873 scope.go:117] "RemoveContainer" containerID="7a12463c2cf197b1f920440df50985d94ae3e7a22c56ad882d01bc741d80d703" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.351186 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-99d6b5b4f-2j7fk" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.353738 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e","Type":"ContainerStarted","Data":"d67a114aa214902359c1e29c718493f6dd023a96ca4f5d6261a47eb5f1d136c6"} Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.370866 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab448dfd-a67c-49b5-a153-92a5a6f504b2","Type":"ContainerStarted","Data":"c3a492f4bbf32fe7c67e7763c6c5275b1ff0f9bdb61d830125564dd30c92b1f1"} Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.389666 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-99d6b5b4f-2j7fk"] Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.403255 4873 scope.go:117] "RemoveContainer" containerID="0caa3e8656105d67ae98953c7c54ce1e536f9e27f2d0305163026fbf53ca79e0" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.414166 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-99d6b5b4f-2j7fk"] Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.507714 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="402372ed-3c0d-4d12-a4f5-bbd82024a08d" path="/var/lib/kubelet/pods/402372ed-3c0d-4d12-a4f5-bbd82024a08d/volumes" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.508426 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5fcd445c48-xvpw4"] Feb 19 10:02:55 crc kubenswrapper[4873]: E0219 10:02:55.510278 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402372ed-3c0d-4d12-a4f5-bbd82024a08d" containerName="init" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.510304 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="402372ed-3c0d-4d12-a4f5-bbd82024a08d" containerName="init" Feb 19 10:02:55 crc 
kubenswrapper[4873]: E0219 10:02:55.510326 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402372ed-3c0d-4d12-a4f5-bbd82024a08d" containerName="dnsmasq-dns" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.510336 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="402372ed-3c0d-4d12-a4f5-bbd82024a08d" containerName="dnsmasq-dns" Feb 19 10:02:55 crc kubenswrapper[4873]: E0219 10:02:55.510350 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58099bc8-1a29-467b-b13d-c0713e42e6c2" containerName="keystone-bootstrap" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.510358 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="58099bc8-1a29-467b-b13d-c0713e42e6c2" containerName="keystone-bootstrap" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.510630 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="402372ed-3c0d-4d12-a4f5-bbd82024a08d" containerName="dnsmasq-dns" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.510670 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="58099bc8-1a29-467b-b13d-c0713e42e6c2" containerName="keystone-bootstrap" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.511433 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5fcd445c48-xvpw4"] Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.511552 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.524649 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.524915 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.525780 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-27d74" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.526152 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.526392 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.530095 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.619421 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-fernet-keys\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.619473 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-config-data\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.619596 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-internal-tls-certs\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.619628 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-public-tls-certs\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.619666 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g25z\" (UniqueName: \"kubernetes.io/projected/ed86f09e-909d-451b-96c0-9b4b7b27eb03-kube-api-access-8g25z\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.619690 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-combined-ca-bundle\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.619708 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-credential-keys\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.619726 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-scripts\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.723562 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-internal-tls-certs\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.723643 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-public-tls-certs\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.723674 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g25z\" (UniqueName: \"kubernetes.io/projected/ed86f09e-909d-451b-96c0-9b4b7b27eb03-kube-api-access-8g25z\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.723712 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-combined-ca-bundle\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.723736 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-credential-keys\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.723758 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-scripts\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.723830 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-fernet-keys\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.723857 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-config-data\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.730578 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-combined-ca-bundle\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.740278 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-public-tls-certs\") pod 
\"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.740945 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-internal-tls-certs\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.743084 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-credential-keys\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.743336 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-fernet-keys\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.743447 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-config-data\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.743698 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g25z\" (UniqueName: \"kubernetes.io/projected/ed86f09e-909d-451b-96c0-9b4b7b27eb03-kube-api-access-8g25z\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" 
Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.756239 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed86f09e-909d-451b-96c0-9b4b7b27eb03-scripts\") pod \"keystone-5fcd445c48-xvpw4\" (UID: \"ed86f09e-909d-451b-96c0-9b4b7b27eb03\") " pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:55 crc kubenswrapper[4873]: I0219 10:02:55.844908 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.187061 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 10:02:56 crc kubenswrapper[4873]: E0219 10:02:56.298767 4873 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99300e7b247193a93a473310bd314f670d8dad13e2e93b1e553f7e8d446453ed is running failed: container process not found" containerID="99300e7b247193a93a473310bd314f670d8dad13e2e93b1e553f7e8d446453ed" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 10:02:56 crc kubenswrapper[4873]: E0219 10:02:56.299147 4873 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99300e7b247193a93a473310bd314f670d8dad13e2e93b1e553f7e8d446453ed is running failed: container process not found" containerID="99300e7b247193a93a473310bd314f670d8dad13e2e93b1e553f7e8d446453ed" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 10:02:56 crc kubenswrapper[4873]: E0219 10:02:56.299666 4873 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99300e7b247193a93a473310bd314f670d8dad13e2e93b1e553f7e8d446453ed is running failed: container process not found" 
containerID="99300e7b247193a93a473310bd314f670d8dad13e2e93b1e553f7e8d446453ed" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Feb 19 10:02:56 crc kubenswrapper[4873]: E0219 10:02:56.299705 4873 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 99300e7b247193a93a473310bd314f670d8dad13e2e93b1e553f7e8d446453ed is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="95402218-fbbb-4453-aba6-d135ba3a26bd" containerName="watcher-applier" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.339396 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-config-data\") pod \"b8008736-31ec-491c-aa52-03b9413feab9\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.339518 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-combined-ca-bundle\") pod \"b8008736-31ec-491c-aa52-03b9413feab9\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.339553 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-custom-prometheus-ca\") pod \"b8008736-31ec-491c-aa52-03b9413feab9\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.339630 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf92r\" (UniqueName: \"kubernetes.io/projected/b8008736-31ec-491c-aa52-03b9413feab9-kube-api-access-cf92r\") pod \"b8008736-31ec-491c-aa52-03b9413feab9\" (UID: 
\"b8008736-31ec-491c-aa52-03b9413feab9\") " Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.339791 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8008736-31ec-491c-aa52-03b9413feab9-logs\") pod \"b8008736-31ec-491c-aa52-03b9413feab9\" (UID: \"b8008736-31ec-491c-aa52-03b9413feab9\") " Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.347359 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8008736-31ec-491c-aa52-03b9413feab9-logs" (OuterVolumeSpecName: "logs") pod "b8008736-31ec-491c-aa52-03b9413feab9" (UID: "b8008736-31ec-491c-aa52-03b9413feab9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.348985 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8008736-31ec-491c-aa52-03b9413feab9-kube-api-access-cf92r" (OuterVolumeSpecName: "kube-api-access-cf92r") pod "b8008736-31ec-491c-aa52-03b9413feab9" (UID: "b8008736-31ec-491c-aa52-03b9413feab9"). InnerVolumeSpecName "kube-api-access-cf92r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.399893 4873 generic.go:334] "Generic (PLEG): container finished" podID="95402218-fbbb-4453-aba6-d135ba3a26bd" containerID="99300e7b247193a93a473310bd314f670d8dad13e2e93b1e553f7e8d446453ed" exitCode=0 Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.399967 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"95402218-fbbb-4453-aba6-d135ba3a26bd","Type":"ContainerDied","Data":"99300e7b247193a93a473310bd314f670d8dad13e2e93b1e553f7e8d446453ed"} Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.401895 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a","Type":"ContainerStarted","Data":"69f1aeefee26a2a735764a65c138335e9092bf60086c87c1bdd76e2c3a2719f0"} Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.409913 4873 generic.go:334] "Generic (PLEG): container finished" podID="ec5489a2-23e2-4875-a19b-d15b4ad6c8c6" containerID="ad115d69dacdb41d674a33e1db809c1ccc6821733d5a2f7e41e2ae5cb63809b4" exitCode=0 Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.409953 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-98gbw" event={"ID":"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6","Type":"ContainerDied","Data":"ad115d69dacdb41d674a33e1db809c1ccc6821733d5a2f7e41e2ae5cb63809b4"} Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.433552 4873 generic.go:334] "Generic (PLEG): container finished" podID="b8008736-31ec-491c-aa52-03b9413feab9" containerID="a99eef84ecab86eece9faca56461f62ff2124a5b8881a9de7669714effc8b149" exitCode=1 Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.433614 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"b8008736-31ec-491c-aa52-03b9413feab9","Type":"ContainerDied","Data":"a99eef84ecab86eece9faca56461f62ff2124a5b8881a9de7669714effc8b149"} Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.433647 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"b8008736-31ec-491c-aa52-03b9413feab9","Type":"ContainerDied","Data":"a2ea00441668ccb8b861c9df038d7b7675d321f71184b1fb21464b11d16d7eef"} Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.433665 4873 scope.go:117] "RemoveContainer" containerID="a99eef84ecab86eece9faca56461f62ff2124a5b8881a9de7669714effc8b149" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.433686 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.438512 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e","Type":"ContainerStarted","Data":"00e17aa3c77a8dac057b0211e38bac6faa2ba84727374dd1e7825f5c8cd0363b"} Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.440086 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5fcd445c48-xvpw4"] Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.445158 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf92r\" (UniqueName: \"kubernetes.io/projected/b8008736-31ec-491c-aa52-03b9413feab9-kube-api-access-cf92r\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.445184 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8008736-31ec-491c-aa52-03b9413feab9-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.459399 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b8008736-31ec-491c-aa52-03b9413feab9" (UID: "b8008736-31ec-491c-aa52-03b9413feab9"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.489418 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8008736-31ec-491c-aa52-03b9413feab9" (UID: "b8008736-31ec-491c-aa52-03b9413feab9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.517252 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-config-data" (OuterVolumeSpecName: "config-data") pod "b8008736-31ec-491c-aa52-03b9413feab9" (UID: "b8008736-31ec-491c-aa52-03b9413feab9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.550284 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.550312 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.550324 4873 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b8008736-31ec-491c-aa52-03b9413feab9-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.629253 4873 scope.go:117] "RemoveContainer" containerID="a99eef84ecab86eece9faca56461f62ff2124a5b8881a9de7669714effc8b149" Feb 19 10:02:56 crc kubenswrapper[4873]: E0219 10:02:56.630044 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a99eef84ecab86eece9faca56461f62ff2124a5b8881a9de7669714effc8b149\": container with ID starting with a99eef84ecab86eece9faca56461f62ff2124a5b8881a9de7669714effc8b149 not found: ID does not exist" containerID="a99eef84ecab86eece9faca56461f62ff2124a5b8881a9de7669714effc8b149" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.630084 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a99eef84ecab86eece9faca56461f62ff2124a5b8881a9de7669714effc8b149"} err="failed to get container status \"a99eef84ecab86eece9faca56461f62ff2124a5b8881a9de7669714effc8b149\": rpc error: code = NotFound desc = could not find container \"a99eef84ecab86eece9faca56461f62ff2124a5b8881a9de7669714effc8b149\": container with ID starting 
with a99eef84ecab86eece9faca56461f62ff2124a5b8881a9de7669714effc8b149 not found: ID does not exist" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.702699 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.773085 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.801182 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.812034 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:02:56 crc kubenswrapper[4873]: E0219 10:02:56.812681 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8008736-31ec-491c-aa52-03b9413feab9" containerName="watcher-decision-engine" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.812707 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8008736-31ec-491c-aa52-03b9413feab9" containerName="watcher-decision-engine" Feb 19 10:02:56 crc kubenswrapper[4873]: E0219 10:02:56.812734 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95402218-fbbb-4453-aba6-d135ba3a26bd" containerName="watcher-applier" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.812740 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="95402218-fbbb-4453-aba6-d135ba3a26bd" containerName="watcher-applier" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.812897 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8008736-31ec-491c-aa52-03b9413feab9" containerName="watcher-decision-engine" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.812920 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="95402218-fbbb-4453-aba6-d135ba3a26bd" containerName="watcher-applier" Feb 19 10:02:56 
crc kubenswrapper[4873]: I0219 10:02:56.813502 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.816433 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.841161 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.854503 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95402218-fbbb-4453-aba6-d135ba3a26bd-logs\") pod \"95402218-fbbb-4453-aba6-d135ba3a26bd\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.854636 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95402218-fbbb-4453-aba6-d135ba3a26bd-combined-ca-bundle\") pod \"95402218-fbbb-4453-aba6-d135ba3a26bd\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.854696 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b9v6\" (UniqueName: \"kubernetes.io/projected/95402218-fbbb-4453-aba6-d135ba3a26bd-kube-api-access-4b9v6\") pod \"95402218-fbbb-4453-aba6-d135ba3a26bd\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.854759 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95402218-fbbb-4453-aba6-d135ba3a26bd-config-data\") pod \"95402218-fbbb-4453-aba6-d135ba3a26bd\" (UID: \"95402218-fbbb-4453-aba6-d135ba3a26bd\") " Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.854929 4873 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95402218-fbbb-4453-aba6-d135ba3a26bd-logs" (OuterVolumeSpecName: "logs") pod "95402218-fbbb-4453-aba6-d135ba3a26bd" (UID: "95402218-fbbb-4453-aba6-d135ba3a26bd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.855301 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95402218-fbbb-4453-aba6-d135ba3a26bd-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.859164 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95402218-fbbb-4453-aba6-d135ba3a26bd-kube-api-access-4b9v6" (OuterVolumeSpecName: "kube-api-access-4b9v6") pod "95402218-fbbb-4453-aba6-d135ba3a26bd" (UID: "95402218-fbbb-4453-aba6-d135ba3a26bd"). InnerVolumeSpecName "kube-api-access-4b9v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.914609 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95402218-fbbb-4453-aba6-d135ba3a26bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95402218-fbbb-4453-aba6-d135ba3a26bd" (UID: "95402218-fbbb-4453-aba6-d135ba3a26bd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.960784 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mzrl\" (UniqueName: \"kubernetes.io/projected/ab7f7779-d6dd-4844-8af5-83ade972d9d0-kube-api-access-6mzrl\") pod \"watcher-decision-engine-0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.961030 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.961160 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab7f7779-d6dd-4844-8af5-83ade972d9d0-logs\") pod \"watcher-decision-engine-0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.961381 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.961542 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-config-data\") pod \"watcher-decision-engine-0\" (UID: 
\"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.963901 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95402218-fbbb-4453-aba6-d135ba3a26bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.963928 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b9v6\" (UniqueName: \"kubernetes.io/projected/95402218-fbbb-4453-aba6-d135ba3a26bd-kube-api-access-4b9v6\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:56 crc kubenswrapper[4873]: I0219 10:02:56.981336 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95402218-fbbb-4453-aba6-d135ba3a26bd-config-data" (OuterVolumeSpecName: "config-data") pod "95402218-fbbb-4453-aba6-d135ba3a26bd" (UID: "95402218-fbbb-4453-aba6-d135ba3a26bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.065298 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-config-data\") pod \"watcher-decision-engine-0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.065435 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mzrl\" (UniqueName: \"kubernetes.io/projected/ab7f7779-d6dd-4844-8af5-83ade972d9d0-kube-api-access-6mzrl\") pod \"watcher-decision-engine-0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.065472 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.065513 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab7f7779-d6dd-4844-8af5-83ade972d9d0-logs\") pod \"watcher-decision-engine-0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.065572 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:57 crc 
kubenswrapper[4873]: I0219 10:02:57.065647 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95402218-fbbb-4453-aba6-d135ba3a26bd-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.066380 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab7f7779-d6dd-4844-8af5-83ade972d9d0-logs\") pod \"watcher-decision-engine-0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.070228 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.070628 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-config-data\") pod \"watcher-decision-engine-0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.070716 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.087095 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.087647 4873 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-6mzrl\" (UniqueName: \"kubernetes.io/projected/ab7f7779-d6dd-4844-8af5-83ade972d9d0-kube-api-access-6mzrl\") pod \"watcher-decision-engine-0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.089602 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" containerName="watcher-api-log" containerID="cri-o://cd1c8470e0eda76a81d6e96f1fc492dbb2365558bb9cd6d4a1af5b28681c31e0" gracePeriod=30 Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.090010 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" containerName="watcher-api" containerID="cri-o://63ad458896e287b047bf524f55c92f7c8b727110e2a9fed53b0acb11b2c99b24" gracePeriod=30 Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.134536 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.508058 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8008736-31ec-491c-aa52-03b9413feab9" path="/var/lib/kubelet/pods/b8008736-31ec-491c-aa52-03b9413feab9/volumes" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.509929 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e","Type":"ContainerStarted","Data":"4ba322d3698975ce137f4a01e18bf101e9ef8127a707662eba0de8fcfb0c7ffe"} Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.527240 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"95402218-fbbb-4453-aba6-d135ba3a26bd","Type":"ContainerDied","Data":"db54f1f84baadc90d4260a330bf2f720ca4cd24fbaeefa2f3e5f18f033d41844"} Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.527271 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.527746 4873 scope.go:117] "RemoveContainer" containerID="99300e7b247193a93a473310bd314f670d8dad13e2e93b1e553f7e8d446453ed" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.539993 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a","Type":"ContainerStarted","Data":"78be319f5d27debcef5596e89d486ade1ea7d0c8bf4ab5f7c035ba6d936419b4"} Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.547940 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fcd445c48-xvpw4" event={"ID":"ed86f09e-909d-451b-96c0-9b4b7b27eb03","Type":"ContainerStarted","Data":"69a430e9811337d1d7781794bfb96f803c68ee550b820556087a0414b2457040"} Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.547980 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fcd445c48-xvpw4" event={"ID":"ed86f09e-909d-451b-96c0-9b4b7b27eb03","Type":"ContainerStarted","Data":"f37425886de9ea059884eff8e220b6be7a28cb06584c590c19ebb7a83a84c7d0"} Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.549870 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.556632 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.556614243 podStartE2EDuration="7.556614243s" podCreationTimestamp="2026-02-19 10:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:57.545503397 +0000 UTC m=+1086.834935025" watchObservedRunningTime="2026-02-19 10:02:57.556614243 +0000 UTC m=+1086.846045881" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.581139 4873 
generic.go:334] "Generic (PLEG): container finished" podID="943d069e-6ad4-4411-b937-c4499f0ced6f" containerID="e48e8c3f3cd0f5266c11f4e70aa13217161d56e20c834665683a06bd7308e111" exitCode=0 Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.581225 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4pv5z" event={"ID":"943d069e-6ad4-4411-b937-c4499f0ced6f","Type":"ContainerDied","Data":"e48e8c3f3cd0f5266c11f4e70aa13217161d56e20c834665683a06bd7308e111"} Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.590129 4873 generic.go:334] "Generic (PLEG): container finished" podID="de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" containerID="cd1c8470e0eda76a81d6e96f1fc492dbb2365558bb9cd6d4a1af5b28681c31e0" exitCode=143 Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.590320 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2","Type":"ContainerDied","Data":"cd1c8470e0eda76a81d6e96f1fc492dbb2365558bb9cd6d4a1af5b28681c31e0"} Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.601004 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.600979603 podStartE2EDuration="7.600979603s" podCreationTimestamp="2026-02-19 10:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:57.578200808 +0000 UTC m=+1086.867632446" watchObservedRunningTime="2026-02-19 10:02:57.600979603 +0000 UTC m=+1086.890411241" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.612001 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5fcd445c48-xvpw4" podStartSLOduration=2.611983316 podStartE2EDuration="2.611983316s" podCreationTimestamp="2026-02-19 10:02:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:57.596508692 +0000 UTC m=+1086.885940330" watchObservedRunningTime="2026-02-19 10:02:57.611983316 +0000 UTC m=+1086.901414954" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.628360 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 10:02:57 crc kubenswrapper[4873]: E0219 10:02:57.643839 4873 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95402218_fbbb_4453_aba6_d135ba3a26bd.slice/crio-db54f1f84baadc90d4260a330bf2f720ca4cd24fbaeefa2f3e5f18f033d41844\": RecentStats: unable to find data in memory cache]" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.651494 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.662037 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.663201 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.677219 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.681206 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.689083 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.785498 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d0e231c-7848-4f57-a28b-dfec3c87b617-config-data\") pod \"watcher-applier-0\" (UID: \"3d0e231c-7848-4f57-a28b-dfec3c87b617\") " pod="openstack/watcher-applier-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.785625 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0e231c-7848-4f57-a28b-dfec3c87b617-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"3d0e231c-7848-4f57-a28b-dfec3c87b617\") " pod="openstack/watcher-applier-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.785702 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz8q8\" (UniqueName: \"kubernetes.io/projected/3d0e231c-7848-4f57-a28b-dfec3c87b617-kube-api-access-zz8q8\") pod \"watcher-applier-0\" (UID: \"3d0e231c-7848-4f57-a28b-dfec3c87b617\") " pod="openstack/watcher-applier-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.785750 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d0e231c-7848-4f57-a28b-dfec3c87b617-logs\") pod \"watcher-applier-0\" 
(UID: \"3d0e231c-7848-4f57-a28b-dfec3c87b617\") " pod="openstack/watcher-applier-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.887256 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d0e231c-7848-4f57-a28b-dfec3c87b617-config-data\") pod \"watcher-applier-0\" (UID: \"3d0e231c-7848-4f57-a28b-dfec3c87b617\") " pod="openstack/watcher-applier-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.887654 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0e231c-7848-4f57-a28b-dfec3c87b617-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"3d0e231c-7848-4f57-a28b-dfec3c87b617\") " pod="openstack/watcher-applier-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.887736 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz8q8\" (UniqueName: \"kubernetes.io/projected/3d0e231c-7848-4f57-a28b-dfec3c87b617-kube-api-access-zz8q8\") pod \"watcher-applier-0\" (UID: \"3d0e231c-7848-4f57-a28b-dfec3c87b617\") " pod="openstack/watcher-applier-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.887773 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d0e231c-7848-4f57-a28b-dfec3c87b617-logs\") pod \"watcher-applier-0\" (UID: \"3d0e231c-7848-4f57-a28b-dfec3c87b617\") " pod="openstack/watcher-applier-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.889030 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d0e231c-7848-4f57-a28b-dfec3c87b617-logs\") pod \"watcher-applier-0\" (UID: \"3d0e231c-7848-4f57-a28b-dfec3c87b617\") " pod="openstack/watcher-applier-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.896422 4873 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0e231c-7848-4f57-a28b-dfec3c87b617-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"3d0e231c-7848-4f57-a28b-dfec3c87b617\") " pod="openstack/watcher-applier-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.897736 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d0e231c-7848-4f57-a28b-dfec3c87b617-config-data\") pod \"watcher-applier-0\" (UID: \"3d0e231c-7848-4f57-a28b-dfec3c87b617\") " pod="openstack/watcher-applier-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.908343 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz8q8\" (UniqueName: \"kubernetes.io/projected/3d0e231c-7848-4f57-a28b-dfec3c87b617-kube-api-access-zz8q8\") pod \"watcher-applier-0\" (UID: \"3d0e231c-7848-4f57-a28b-dfec3c87b617\") " pod="openstack/watcher-applier-0" Feb 19 10:02:57 crc kubenswrapper[4873]: I0219 10:02:57.967954 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-98gbw" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.090511 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9t4k\" (UniqueName: \"kubernetes.io/projected/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-kube-api-access-w9t4k\") pod \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.090601 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-scripts\") pod \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.090702 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-logs\") pod \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.090872 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-config-data\") pod \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.090922 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-combined-ca-bundle\") pod \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\" (UID: \"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6\") " Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.091477 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-logs" (OuterVolumeSpecName: "logs") pod "ec5489a2-23e2-4875-a19b-d15b4ad6c8c6" (UID: "ec5489a2-23e2-4875-a19b-d15b4ad6c8c6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.095307 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-scripts" (OuterVolumeSpecName: "scripts") pod "ec5489a2-23e2-4875-a19b-d15b4ad6c8c6" (UID: "ec5489a2-23e2-4875-a19b-d15b4ad6c8c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.096318 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-kube-api-access-w9t4k" (OuterVolumeSpecName: "kube-api-access-w9t4k") pod "ec5489a2-23e2-4875-a19b-d15b4ad6c8c6" (UID: "ec5489a2-23e2-4875-a19b-d15b4ad6c8c6"). InnerVolumeSpecName "kube-api-access-w9t4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.118279 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.121491 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec5489a2-23e2-4875-a19b-d15b4ad6c8c6" (UID: "ec5489a2-23e2-4875-a19b-d15b4ad6c8c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.122064 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-config-data" (OuterVolumeSpecName: "config-data") pod "ec5489a2-23e2-4875-a19b-d15b4ad6c8c6" (UID: "ec5489a2-23e2-4875-a19b-d15b4ad6c8c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.193047 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.193377 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.193391 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9t4k\" (UniqueName: \"kubernetes.io/projected/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-kube-api-access-w9t4k\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.193402 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.194343 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.229731 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" 
containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.163:9322/\": read tcp 10.217.0.2:42954->10.217.0.163:9322: read: connection reset by peer" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.230034 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9322/\": read tcp 10.217.0.2:42950->10.217.0.163:9322: read: connection reset by peer" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.561351 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6696d67b98-wrvnm"] Feb 19 10:02:58 crc kubenswrapper[4873]: E0219 10:02:58.561790 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5489a2-23e2-4875-a19b-d15b4ad6c8c6" containerName="placement-db-sync" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.561806 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5489a2-23e2-4875-a19b-d15b4ad6c8c6" containerName="placement-db-sync" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.562022 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec5489a2-23e2-4875-a19b-d15b4ad6c8c6" containerName="placement-db-sync" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.563257 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.566632 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.566832 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.577720 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6696d67b98-wrvnm"] Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.600561 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67vtn\" (UniqueName: \"kubernetes.io/projected/c5d4dde9-793b-403e-8701-84cca6a509e1-kube-api-access-67vtn\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.600614 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-scripts\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.600647 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-internal-tls-certs\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.600674 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-config-data\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.600705 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-combined-ca-bundle\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.600751 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d4dde9-793b-403e-8701-84cca6a509e1-logs\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.600766 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-public-tls-certs\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.604495 4873 generic.go:334] "Generic (PLEG): container finished" podID="de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" containerID="63ad458896e287b047bf524f55c92f7c8b727110e2a9fed53b0acb11b2c99b24" exitCode=0 Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.604691 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2","Type":"ContainerDied","Data":"63ad458896e287b047bf524f55c92f7c8b727110e2a9fed53b0acb11b2c99b24"} Feb 19 10:02:58 crc kubenswrapper[4873]: 
I0219 10:02:58.635780 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"ab7f7779-d6dd-4844-8af5-83ade972d9d0","Type":"ContainerStarted","Data":"a62f2b1b9a301f2ebebd7bf5613870ec1c3fc6e4830ec22341716d60b02e765d"} Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.635820 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"ab7f7779-d6dd-4844-8af5-83ade972d9d0","Type":"ContainerStarted","Data":"6715424b51c6df78b1881817986335974e70067799bdff519c5527858f40bf0f"} Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.655600 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-98gbw" event={"ID":"ec5489a2-23e2-4875-a19b-d15b4ad6c8c6","Type":"ContainerDied","Data":"37b0ac3d48e8bec4044f6f8f22d9abb8c79cc58cedf7d1bbf6b0fb89fcc2a84a"} Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.655641 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37b0ac3d48e8bec4044f6f8f22d9abb8c79cc58cedf7d1bbf6b0fb89fcc2a84a" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.655674 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-98gbw" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.656676 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.656659234 podStartE2EDuration="2.656659234s" podCreationTimestamp="2026-02-19 10:02:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:58.654377047 +0000 UTC m=+1087.943808685" watchObservedRunningTime="2026-02-19 10:02:58.656659234 +0000 UTC m=+1087.946090872" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.702954 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-scripts\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.711405 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-scripts\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.703295 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-internal-tls-certs\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.714630 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-config-data\") 
pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.714733 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-combined-ca-bundle\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.714774 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d4dde9-793b-403e-8701-84cca6a509e1-logs\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.714790 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-public-tls-certs\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.715132 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67vtn\" (UniqueName: \"kubernetes.io/projected/c5d4dde9-793b-403e-8701-84cca6a509e1-kube-api-access-67vtn\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.719803 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-internal-tls-certs\") pod \"placement-6696d67b98-wrvnm\" (UID: 
\"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.724362 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d4dde9-793b-403e-8701-84cca6a509e1-logs\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.725307 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-combined-ca-bundle\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.728247 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-public-tls-certs\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.737924 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d4dde9-793b-403e-8701-84cca6a509e1-config-data\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.744727 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67vtn\" (UniqueName: \"kubernetes.io/projected/c5d4dde9-793b-403e-8701-84cca6a509e1-kube-api-access-67vtn\") pod \"placement-6696d67b98-wrvnm\" (UID: \"c5d4dde9-793b-403e-8701-84cca6a509e1\") " pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc 
kubenswrapper[4873]: I0219 10:02:58.772071 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.859274 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.888301 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.923923 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-config-data\") pod \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.924017 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrwmp\" (UniqueName: \"kubernetes.io/projected/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-kube-api-access-jrwmp\") pod \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.924039 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-custom-prometheus-ca\") pod \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.924187 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-combined-ca-bundle\") pod \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.924299 4873 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-logs\") pod \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\" (UID: \"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2\") " Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.925738 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-logs" (OuterVolumeSpecName: "logs") pod "de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" (UID: "de2aeb33-bbce-4b15-a2c4-ce80764ef0c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:02:58 crc kubenswrapper[4873]: I0219 10:02:58.950271 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-kube-api-access-jrwmp" (OuterVolumeSpecName: "kube-api-access-jrwmp") pod "de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" (UID: "de2aeb33-bbce-4b15-a2c4-ce80764ef0c2"). InnerVolumeSpecName "kube-api-access-jrwmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.026972 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrwmp\" (UniqueName: \"kubernetes.io/projected/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-kube-api-access-jrwmp\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.027006 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.039324 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" (UID: "de2aeb33-bbce-4b15-a2c4-ce80764ef0c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.074248 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" (UID: "de2aeb33-bbce-4b15-a2c4-ce80764ef0c2"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.077333 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-config-data" (OuterVolumeSpecName: "config-data") pod "de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" (UID: "de2aeb33-bbce-4b15-a2c4-ce80764ef0c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.090544 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4pv5z" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.127978 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78b5v\" (UniqueName: \"kubernetes.io/projected/943d069e-6ad4-4411-b937-c4499f0ced6f-kube-api-access-78b5v\") pod \"943d069e-6ad4-4411-b937-c4499f0ced6f\" (UID: \"943d069e-6ad4-4411-b937-c4499f0ced6f\") " Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.128047 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/943d069e-6ad4-4411-b937-c4499f0ced6f-db-sync-config-data\") pod \"943d069e-6ad4-4411-b937-c4499f0ced6f\" (UID: \"943d069e-6ad4-4411-b937-c4499f0ced6f\") " Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.128117 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d069e-6ad4-4411-b937-c4499f0ced6f-combined-ca-bundle\") pod \"943d069e-6ad4-4411-b937-c4499f0ced6f\" (UID: \"943d069e-6ad4-4411-b937-c4499f0ced6f\") " Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.129320 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.129348 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.129360 4873 reconciler_common.go:293] "Volume detached for volume 
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.148760 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943d069e-6ad4-4411-b937-c4499f0ced6f-kube-api-access-78b5v" (OuterVolumeSpecName: "kube-api-access-78b5v") pod "943d069e-6ad4-4411-b937-c4499f0ced6f" (UID: "943d069e-6ad4-4411-b937-c4499f0ced6f"). InnerVolumeSpecName "kube-api-access-78b5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.149056 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943d069e-6ad4-4411-b937-c4499f0ced6f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "943d069e-6ad4-4411-b937-c4499f0ced6f" (UID: "943d069e-6ad4-4411-b937-c4499f0ced6f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.160549 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943d069e-6ad4-4411-b937-c4499f0ced6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "943d069e-6ad4-4411-b937-c4499f0ced6f" (UID: "943d069e-6ad4-4411-b937-c4499f0ced6f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.231236 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78b5v\" (UniqueName: \"kubernetes.io/projected/943d069e-6ad4-4411-b937-c4499f0ced6f-kube-api-access-78b5v\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.231264 4873 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/943d069e-6ad4-4411-b937-c4499f0ced6f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.231273 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943d069e-6ad4-4411-b937-c4499f0ced6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.451302 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6696d67b98-wrvnm"] Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.510722 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95402218-fbbb-4453-aba6-d135ba3a26bd" path="/var/lib/kubelet/pods/95402218-fbbb-4453-aba6-d135ba3a26bd/volumes" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.683707 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"3d0e231c-7848-4f57-a28b-dfec3c87b617","Type":"ContainerStarted","Data":"b3d36f077efd4110d6b89432fd6d1d4edec15caa67228502bc5035f6a21b3517"} Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.683750 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"3d0e231c-7848-4f57-a28b-dfec3c87b617","Type":"ContainerStarted","Data":"f08d44f6d42fcc54a7c13ee85c8b5e428b7cfde82ddce8291971b4f378eefbec"} Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.688869 4873 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-6696d67b98-wrvnm" event={"ID":"c5d4dde9-793b-403e-8701-84cca6a509e1","Type":"ContainerStarted","Data":"6dfe316e1d4a830fcd595802df839bbc29f0e446bb5bc6d014b82d92b16ddd5d"} Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.717663 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4pv5z" event={"ID":"943d069e-6ad4-4411-b937-c4499f0ced6f","Type":"ContainerDied","Data":"db07d7286194b278e2cec929f66edc47c3ebbe39668738c5d112ac9e99a6a103"} Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.717712 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db07d7286194b278e2cec929f66edc47c3ebbe39668738c5d112ac9e99a6a103" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.717790 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4pv5z" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.721163 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.7211395830000003 podStartE2EDuration="2.721139583s" podCreationTimestamp="2026-02-19 10:02:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:02:59.70612579 +0000 UTC m=+1088.995557438" watchObservedRunningTime="2026-02-19 10:02:59.721139583 +0000 UTC m=+1089.010571221" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.731636 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"de2aeb33-bbce-4b15-a2c4-ce80764ef0c2","Type":"ContainerDied","Data":"5c219368094a8a1c527a293923d373b84ac19d1c24f32f1ef5514bf9fa1b2437"} Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.731781 4873 scope.go:117] "RemoveContainer" containerID="63ad458896e287b047bf524f55c92f7c8b727110e2a9fed53b0acb11b2c99b24" Feb 19 10:02:59 crc 
kubenswrapper[4873]: I0219 10:02:59.732618 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.810207 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.823003 4873 scope.go:117] "RemoveContainer" containerID="cd1c8470e0eda76a81d6e96f1fc492dbb2365558bb9cd6d4a1af5b28681c31e0" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.856225 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.878711 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Feb 19 10:02:59 crc kubenswrapper[4873]: E0219 10:02:59.879143 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943d069e-6ad4-4411-b937-c4499f0ced6f" containerName="barbican-db-sync" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.879159 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="943d069e-6ad4-4411-b937-c4499f0ced6f" containerName="barbican-db-sync" Feb 19 10:02:59 crc kubenswrapper[4873]: E0219 10:02:59.879171 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" containerName="watcher-api-log" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.879177 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" containerName="watcher-api-log" Feb 19 10:02:59 crc kubenswrapper[4873]: E0219 10:02:59.879190 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" containerName="watcher-api" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.879196 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" containerName="watcher-api" Feb 19 10:02:59 crc 
kubenswrapper[4873]: I0219 10:02:59.879401 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="943d069e-6ad4-4411-b937-c4499f0ced6f" containerName="barbican-db-sync" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.879421 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" containerName="watcher-api-log" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.879441 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" containerName="watcher-api" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.880386 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.889692 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.890168 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.890327 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.906191 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.961061 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-config-data\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0" Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.961172 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.961292 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-public-tls-certs\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.961426 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnc4h\" (UniqueName: \"kubernetes.io/projected/9fb835f9-7ac4-4212-a372-b793c2fb8afd-kube-api-access-qnc4h\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.961476 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.961569 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.961612 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fb835f9-7ac4-4212-a372-b793c2fb8afd-logs\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.973384 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-667444df98-tdgw9"]
Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.974839 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.980793 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.982415 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.982628 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-t72rv"
Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.994176 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-596d5556df-fx4q8"]
Feb 19 10:02:59 crc kubenswrapper[4873]: I0219 10:02:59.996066 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.002902 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.015017 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-667444df98-tdgw9"]
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.051173 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-596d5556df-fx4q8"]
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063305 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc48b70c-5ab9-4765-a8cd-5985a3f63854-config-data-custom\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063354 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-public-tls-certs\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063396 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt6c6\" (UniqueName: \"kubernetes.io/projected/fc48b70c-5ab9-4765-a8cd-5985a3f63854-kube-api-access-nt6c6\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063430 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnc4h\" (UniqueName: \"kubernetes.io/projected/9fb835f9-7ac4-4212-a372-b793c2fb8afd-kube-api-access-qnc4h\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063453 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063469 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be5e1ee-a214-46ca-a5bf-d1d337848085-combined-ca-bundle\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063495 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9be5e1ee-a214-46ca-a5bf-d1d337848085-logs\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063510 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be5e1ee-a214-46ca-a5bf-d1d337848085-config-data\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063523 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9be5e1ee-a214-46ca-a5bf-d1d337848085-config-data-custom\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063552 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063577 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fb835f9-7ac4-4212-a372-b793c2fb8afd-logs\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063597 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc48b70c-5ab9-4765-a8cd-5985a3f63854-combined-ca-bundle\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063634 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc48b70c-5ab9-4765-a8cd-5985a3f63854-logs\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063653 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-config-data\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063668 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063692 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc48b70c-5ab9-4765-a8cd-5985a3f63854-config-data\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.063711 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlqvf\" (UniqueName: \"kubernetes.io/projected/9be5e1ee-a214-46ca-a5bf-d1d337848085-kube-api-access-rlqvf\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.069410 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55dcd76767-7nrrt"]
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.071091 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.072720 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fb835f9-7ac4-4212-a372-b793c2fb8afd-logs\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.073161 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-public-tls-certs\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.087690 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.096738 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.097038 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55dcd76767-7nrrt"]
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.097298 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-config-data\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.100412 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnc4h\" (UniqueName: \"kubernetes.io/projected/9fb835f9-7ac4-4212-a372-b793c2fb8afd-kube-api-access-qnc4h\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.102002 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb835f9-7ac4-4212-a372-b793c2fb8afd-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9fb835f9-7ac4-4212-a372-b793c2fb8afd\") " pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.165606 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt6c6\" (UniqueName: \"kubernetes.io/projected/fc48b70c-5ab9-4765-a8cd-5985a3f63854-kube-api-access-nt6c6\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.165680 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be5e1ee-a214-46ca-a5bf-d1d337848085-combined-ca-bundle\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.165711 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9be5e1ee-a214-46ca-a5bf-d1d337848085-logs\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.165727 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be5e1ee-a214-46ca-a5bf-d1d337848085-config-data\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.165743 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9be5e1ee-a214-46ca-a5bf-d1d337848085-config-data-custom\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.165774 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-dns-svc\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.165823 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc48b70c-5ab9-4765-a8cd-5985a3f63854-combined-ca-bundle\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.165841 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-dns-swift-storage-0\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.165865 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-ovsdbserver-sb\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.165887 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-config\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.165902 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-ovsdbserver-nb\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.165931 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc48b70c-5ab9-4765-a8cd-5985a3f63854-logs\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.165963 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc48b70c-5ab9-4765-a8cd-5985a3f63854-config-data\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.165984 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlqvf\" (UniqueName: \"kubernetes.io/projected/9be5e1ee-a214-46ca-a5bf-d1d337848085-kube-api-access-rlqvf\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.166024 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc48b70c-5ab9-4765-a8cd-5985a3f63854-config-data-custom\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.166075 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbv6s\" (UniqueName: \"kubernetes.io/projected/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-kube-api-access-sbv6s\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.170242 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc48b70c-5ab9-4765-a8cd-5985a3f63854-logs\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.170474 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9be5e1ee-a214-46ca-a5bf-d1d337848085-logs\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.170914 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-785b79c884-tswfl"]
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.172945 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.177022 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9be5e1ee-a214-46ca-a5bf-d1d337848085-config-data\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.177283 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.187694 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9be5e1ee-a214-46ca-a5bf-d1d337848085-config-data-custom\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.188186 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc48b70c-5ab9-4765-a8cd-5985a3f63854-config-data-custom\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.191749 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt6c6\" (UniqueName: \"kubernetes.io/projected/fc48b70c-5ab9-4765-a8cd-5985a3f63854-kube-api-access-nt6c6\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.192466 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-785b79c884-tswfl"]
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.192520 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be5e1ee-a214-46ca-a5bf-d1d337848085-combined-ca-bundle\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.196561 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc48b70c-5ab9-4765-a8cd-5985a3f63854-config-data\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.202955 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc48b70c-5ab9-4765-a8cd-5985a3f63854-combined-ca-bundle\") pod \"barbican-worker-596d5556df-fx4q8\" (UID: \"fc48b70c-5ab9-4765-a8cd-5985a3f63854\") " pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.219795 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlqvf\" (UniqueName: \"kubernetes.io/projected/9be5e1ee-a214-46ca-a5bf-d1d337848085-kube-api-access-rlqvf\") pod \"barbican-keystone-listener-667444df98-tdgw9\" (UID: \"9be5e1ee-a214-46ca-a5bf-d1d337848085\") " pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.233284 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.271570 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-config-data\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.271629 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-dns-swift-storage-0\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.271654 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-ovsdbserver-sb\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.271673 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d2df48a-78aa-4711-a0ac-268542093658-logs\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.271695 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-config\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.271710 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-ovsdbserver-nb\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.271728 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-config-data-custom\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.271745 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-combined-ca-bundle\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.271776 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlb7v\" (UniqueName: \"kubernetes.io/projected/0d2df48a-78aa-4711-a0ac-268542093658-kube-api-access-mlb7v\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.271841 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbv6s\" (UniqueName: \"kubernetes.io/projected/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-kube-api-access-sbv6s\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.271907 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-dns-svc\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.272682 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-dns-svc\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.273585 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-ovsdbserver-nb\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.274095 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-ovsdbserver-sb\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.274398 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-dns-swift-storage-0\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.274521 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-config\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.288610 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbv6s\" (UniqueName: \"kubernetes.io/projected/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-kube-api-access-sbv6s\") pod \"dnsmasq-dns-55dcd76767-7nrrt\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") " pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.351501 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-667444df98-tdgw9"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.371467 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-596d5556df-fx4q8"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.376560 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-config-data\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.376603 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d2df48a-78aa-4711-a0ac-268542093658-logs\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.376628 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-config-data-custom\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.376646 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-combined-ca-bundle\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.376676 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlb7v\" (UniqueName: \"kubernetes.io/projected/0d2df48a-78aa-4711-a0ac-268542093658-kube-api-access-mlb7v\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.377928 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d2df48a-78aa-4711-a0ac-268542093658-logs\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.380167 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-config-data\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.382838 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-config-data-custom\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.382863 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-combined-ca-bundle\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.410696 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlb7v\" (UniqueName: \"kubernetes.io/projected/0d2df48a-78aa-4711-a0ac-268542093658-kube-api-access-mlb7v\") pod \"barbican-api-785b79c884-tswfl\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.414587 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.525119 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.736642 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.736996 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.745999 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.746047 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.799505 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6696d67b98-wrvnm" event={"ID":"c5d4dde9-793b-403e-8701-84cca6a509e1","Type":"ContainerStarted","Data":"9dd376fab0cae7be53b1b672a4896b860045853ce1d0dafe4c84f50b5a3f2b10"}
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.817364 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.828869 4873 generic.go:334] "Generic (PLEG): container finished" podID="99868e3f-82d7-4f0c-9056-661e95486e6e" containerID="22b91ea45d57e3f8ed16da3e5a4058c15af39a6d914075c4521ba6755b03990b" exitCode=0
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.829784 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vf762" event={"ID":"99868e3f-82d7-4f0c-9056-661e95486e6e","Type":"ContainerDied","Data":"22b91ea45d57e3f8ed16da3e5a4058c15af39a6d914075c4521ba6755b03990b"}
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.843420 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.845590 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-87df9b646-2jf26" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.885719 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.890300 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.934320 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Feb 19 10:03:00 crc kubenswrapper[4873]: I0219 10:03:00.951821 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6687d9896d-v96j2" podUID="fa527f64-6e38-48c2-9927-a319f4579070" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused"
Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.146700 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-667444df98-tdgw9"]
Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.270240 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-596d5556df-fx4q8"]
Feb 19 10:03:01 crc 
kubenswrapper[4873]: W0219 10:03:01.359360 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc48b70c_5ab9_4765_a8cd_5985a3f63854.slice/crio-5da214c6b3840e2d0ccfbd4713138b3b333fc0c9f0e6b77ec488c97edfdd49c3 WatchSource:0}: Error finding container 5da214c6b3840e2d0ccfbd4713138b3b333fc0c9f0e6b77ec488c97edfdd49c3: Status 404 returned error can't find the container with id 5da214c6b3840e2d0ccfbd4713138b3b333fc0c9f0e6b77ec488c97edfdd49c3 Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.368194 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-785b79c884-tswfl"] Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.380196 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55dcd76767-7nrrt"] Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.513235 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de2aeb33-bbce-4b15-a2c4-ce80764ef0c2" path="/var/lib/kubelet/pods/de2aeb33-bbce-4b15-a2c4-ce80764ef0c2/volumes" Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.852273 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-785b79c884-tswfl" event={"ID":"0d2df48a-78aa-4711-a0ac-268542093658","Type":"ContainerStarted","Data":"01bd16909d7a2865478dd541aeaed2f91ba94845036b47bd03dd48be7c376611"} Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.852576 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-785b79c884-tswfl" event={"ID":"0d2df48a-78aa-4711-a0ac-268542093658","Type":"ContainerStarted","Data":"aa044ff7142e0c26ee94862c0e4c5ca488a9ed1c7a1ffa3af69735d62ea70cbd"} Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.855951 4873 generic.go:334] "Generic (PLEG): container finished" podID="a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" containerID="a00affcc69d0a0f0f0948ccec9e176ec543a48a936aa44d66436705028401e67" exitCode=0 Feb 19 
10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.856014 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55dcd76767-7nrrt" event={"ID":"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9","Type":"ContainerDied","Data":"a00affcc69d0a0f0f0948ccec9e176ec543a48a936aa44d66436705028401e67"} Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.856039 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55dcd76767-7nrrt" event={"ID":"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9","Type":"ContainerStarted","Data":"b5551cf30b386908e2cde5ca7747852cdfecf30b5d0e1c2e9424decee5253a25"} Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.862006 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667444df98-tdgw9" event={"ID":"9be5e1ee-a214-46ca-a5bf-d1d337848085","Type":"ContainerStarted","Data":"fce4e5e0d318f754e2493e1c3317e9d3d11c50fd765ae2ce3c5cc6f56be7b7c2"} Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.869659 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9fb835f9-7ac4-4212-a372-b793c2fb8afd","Type":"ContainerStarted","Data":"6ebdf78bcbfc7429f77cd01f756b2ee54279512ab9259f219782422b0537ad75"} Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.869709 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9fb835f9-7ac4-4212-a372-b793c2fb8afd","Type":"ContainerStarted","Data":"048a19eaf4c1fb549e71b49a7d0f9f321fa31fb431a4095960c4f28e580369d4"} Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.869721 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9fb835f9-7ac4-4212-a372-b793c2fb8afd","Type":"ContainerStarted","Data":"dd06cf8fd74630da92b4e069bb8f9c8fd98bc8b40165ddd7c9e643a5377ce5e1"} Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.870227 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/watcher-api-0" Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.871448 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9fb835f9-7ac4-4212-a372-b793c2fb8afd" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.174:9322/\": dial tcp 10.217.0.174:9322: connect: connection refused" Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.879726 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-596d5556df-fx4q8" event={"ID":"fc48b70c-5ab9-4765-a8cd-5985a3f63854","Type":"ContainerStarted","Data":"5da214c6b3840e2d0ccfbd4713138b3b333fc0c9f0e6b77ec488c97edfdd49c3"} Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.885483 4873 generic.go:334] "Generic (PLEG): container finished" podID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerID="a62f2b1b9a301f2ebebd7bf5613870ec1c3fc6e4830ec22341716d60b02e765d" exitCode=1 Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.885543 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"ab7f7779-d6dd-4844-8af5-83ade972d9d0","Type":"ContainerDied","Data":"a62f2b1b9a301f2ebebd7bf5613870ec1c3fc6e4830ec22341716d60b02e765d"} Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.886185 4873 scope.go:117] "RemoveContainer" containerID="a62f2b1b9a301f2ebebd7bf5613870ec1c3fc6e4830ec22341716d60b02e765d" Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.895659 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gqrb5" event={"ID":"ce5accb4-1da0-4a21-a289-7dba33ad935f","Type":"ContainerStarted","Data":"2874d7c078f6aebe4e7f936700ecedd6916a0afc5a2e7ddcc365abe01b70926a"} Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.925848 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6696d67b98-wrvnm" 
event={"ID":"c5d4dde9-793b-403e-8701-84cca6a509e1","Type":"ContainerStarted","Data":"e5601faf67f3908d22845acd48383e8297ed5e25db35806c3ec713fb4493b2b6"} Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.925919 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.925938 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.925952 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.925966 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.926328 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.926632 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.958037 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.958018487 podStartE2EDuration="2.958018487s" podCreationTimestamp="2026-02-19 10:02:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:01.909924924 +0000 UTC m=+1091.199356552" watchObservedRunningTime="2026-02-19 10:03:01.958018487 +0000 UTC m=+1091.247450125" Feb 19 10:03:01 crc kubenswrapper[4873]: I0219 10:03:01.979599 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-gqrb5" podStartSLOduration=5.057272507 
podStartE2EDuration="51.979573202s" podCreationTimestamp="2026-02-19 10:02:10 +0000 UTC" firstStartedPulling="2026-02-19 10:02:12.727531649 +0000 UTC m=+1042.016963287" lastFinishedPulling="2026-02-19 10:02:59.649832344 +0000 UTC m=+1088.939263982" observedRunningTime="2026-02-19 10:03:01.929306695 +0000 UTC m=+1091.218738353" watchObservedRunningTime="2026-02-19 10:03:01.979573202 +0000 UTC m=+1091.269004840" Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.008735 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6696d67b98-wrvnm" podStartSLOduration=4.008715535 podStartE2EDuration="4.008715535s" podCreationTimestamp="2026-02-19 10:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:01.96983478 +0000 UTC m=+1091.259266428" watchObservedRunningTime="2026-02-19 10:03:02.008715535 +0000 UTC m=+1091.298147173" Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.563623 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vf762" Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.647783 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99868e3f-82d7-4f0c-9056-661e95486e6e-config\") pod \"99868e3f-82d7-4f0c-9056-661e95486e6e\" (UID: \"99868e3f-82d7-4f0c-9056-661e95486e6e\") " Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.647923 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48k6w\" (UniqueName: \"kubernetes.io/projected/99868e3f-82d7-4f0c-9056-661e95486e6e-kube-api-access-48k6w\") pod \"99868e3f-82d7-4f0c-9056-661e95486e6e\" (UID: \"99868e3f-82d7-4f0c-9056-661e95486e6e\") " Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.647961 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99868e3f-82d7-4f0c-9056-661e95486e6e-combined-ca-bundle\") pod \"99868e3f-82d7-4f0c-9056-661e95486e6e\" (UID: \"99868e3f-82d7-4f0c-9056-661e95486e6e\") " Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.695745 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99868e3f-82d7-4f0c-9056-661e95486e6e-kube-api-access-48k6w" (OuterVolumeSpecName: "kube-api-access-48k6w") pod "99868e3f-82d7-4f0c-9056-661e95486e6e" (UID: "99868e3f-82d7-4f0c-9056-661e95486e6e"). InnerVolumeSpecName "kube-api-access-48k6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.746326 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99868e3f-82d7-4f0c-9056-661e95486e6e-config" (OuterVolumeSpecName: "config") pod "99868e3f-82d7-4f0c-9056-661e95486e6e" (UID: "99868e3f-82d7-4f0c-9056-661e95486e6e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.751516 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/99868e3f-82d7-4f0c-9056-661e95486e6e-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.751548 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48k6w\" (UniqueName: \"kubernetes.io/projected/99868e3f-82d7-4f0c-9056-661e95486e6e-kube-api-access-48k6w\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.775295 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99868e3f-82d7-4f0c-9056-661e95486e6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99868e3f-82d7-4f0c-9056-661e95486e6e" (UID: "99868e3f-82d7-4f0c-9056-661e95486e6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.853309 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99868e3f-82d7-4f0c-9056-661e95486e6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.940050 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-785b79c884-tswfl" event={"ID":"0d2df48a-78aa-4711-a0ac-268542093658","Type":"ContainerStarted","Data":"626eef1424fde36fd6c591537282df1773ca875ce665ae3dc8e63713a1a95d18"} Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.940321 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-785b79c884-tswfl" Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.940339 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-785b79c884-tswfl" Feb 19 10:03:02 crc 
kubenswrapper[4873]: I0219 10:03:02.943177 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55dcd76767-7nrrt" event={"ID":"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9","Type":"ContainerStarted","Data":"69b8a44f78afd834bc213e814516e741948268a9c9eaf6506fb2962a05c91488"} Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.943300 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55dcd76767-7nrrt" Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.945763 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vf762" event={"ID":"99868e3f-82d7-4f0c-9056-661e95486e6e","Type":"ContainerDied","Data":"51130049c4f72c45b52b368bcf10130af9e763c98f2e5fc842a0ae20064148f7"} Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.945784 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vf762" Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.945794 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51130049c4f72c45b52b368bcf10130af9e763c98f2e5fc842a0ae20064148f7" Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.956827 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-785b79c884-tswfl" podStartSLOduration=2.956810307 podStartE2EDuration="2.956810307s" podCreationTimestamp="2026-02-19 10:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:02.955548765 +0000 UTC m=+1092.244980403" watchObservedRunningTime="2026-02-19 10:03:02.956810307 +0000 UTC m=+1092.246241945" Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.957358 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"ab7f7779-d6dd-4844-8af5-83ade972d9d0","Type":"ContainerStarted","Data":"5d8a6efa61f7a8c09f644a4fe742859469cc638e71ee46cb830c7a9e3cf72be0"} Feb 19 10:03:02 crc kubenswrapper[4873]: I0219 10:03:02.989827 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55dcd76767-7nrrt" podStartSLOduration=3.9898022749999997 podStartE2EDuration="3.989802275s" podCreationTimestamp="2026-02-19 10:02:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:02.976333011 +0000 UTC m=+1092.265764649" watchObservedRunningTime="2026-02-19 10:03:02.989802275 +0000 UTC m=+1092.279233913" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.118767 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.236765 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55dcd76767-7nrrt"] Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.329470 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-697b559f65-2zvb5"] Feb 19 10:03:03 crc kubenswrapper[4873]: E0219 10:03:03.329881 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99868e3f-82d7-4f0c-9056-661e95486e6e" containerName="neutron-db-sync" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.329892 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="99868e3f-82d7-4f0c-9056-661e95486e6e" containerName="neutron-db-sync" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.330067 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="99868e3f-82d7-4f0c-9056-661e95486e6e" containerName="neutron-db-sync" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.331048 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.338721 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-697b559f65-2zvb5"] Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.450186 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-749b6895f6-pmvtl"] Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.451713 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.461439 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-c4d59d6dd-4nh9w"] Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.463074 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: W0219 10:03:03.469136 4873 reflector.go:561] object-"openstack"/"cert-barbican-internal-svc": failed to list *v1.Secret: secrets "cert-barbican-internal-svc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 19 10:03:03 crc kubenswrapper[4873]: E0219 10:03:03.469181 4873 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cert-barbican-internal-svc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-barbican-internal-svc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 19 10:03:03 crc kubenswrapper[4873]: W0219 10:03:03.471135 4873 reflector.go:561] object-"openstack"/"cert-barbican-public-svc": failed to list *v1.Secret: secrets "cert-barbican-public-svc" is forbidden: User "system:node:crc" cannot list resource 
"secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 19 10:03:03 crc kubenswrapper[4873]: E0219 10:03:03.471191 4873 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cert-barbican-public-svc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-barbican-public-svc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.473547 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pk4jm" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.473778 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.473906 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.474016 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.480401 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-dns-swift-storage-0\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.480462 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sccds\" (UniqueName: \"kubernetes.io/projected/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-kube-api-access-sccds\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: 
\"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.480607 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-ovsdbserver-nb\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.480696 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-config\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.480723 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-ovsdbserver-sb\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.480822 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-dns-svc\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.546059 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-749b6895f6-pmvtl"] Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586579 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-7sp8t\" (UniqueName: \"kubernetes.io/projected/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-kube-api-access-7sp8t\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586639 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-ovndb-tls-certs\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586658 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-config-data\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586680 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-config\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586696 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-ovsdbserver-sb\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586717 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-logs\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586730 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-combined-ca-bundle\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586784 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-config\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586801 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-dns-svc\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586826 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-config-data-custom\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586868 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-dns-swift-storage-0\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586889 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sccds\" (UniqueName: \"kubernetes.io/projected/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-kube-api-access-sccds\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586918 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-httpd-config\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586939 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-internal-tls-certs\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.586953 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkk7t\" (UniqueName: \"kubernetes.io/projected/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-kube-api-access-zkk7t\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.587022 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-ovsdbserver-nb\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.587042 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-public-tls-certs\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.587061 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-combined-ca-bundle\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.587857 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-config\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.587976 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-dns-swift-storage-0\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.588716 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-ovsdbserver-sb\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.589070 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-dns-svc\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.589731 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-ovsdbserver-nb\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.590480 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c4d59d6dd-4nh9w"] Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.673069 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sccds\" (UniqueName: \"kubernetes.io/projected/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-kube-api-access-sccds\") pod \"dnsmasq-dns-697b559f65-2zvb5\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.679923 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.689350 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-config-data-custom\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.689608 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-httpd-config\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.689713 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-internal-tls-certs\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.689813 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkk7t\" (UniqueName: \"kubernetes.io/projected/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-kube-api-access-zkk7t\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.689982 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-public-tls-certs\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " 
pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.690063 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-combined-ca-bundle\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.690164 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sp8t\" (UniqueName: \"kubernetes.io/projected/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-kube-api-access-7sp8t\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.690268 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-ovndb-tls-certs\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.690339 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-config-data\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.690421 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-logs\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc 
kubenswrapper[4873]: I0219 10:03:03.690492 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-combined-ca-bundle\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.690586 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-config\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.693596 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-logs\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.700575 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-httpd-config\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.704784 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-config-data-custom\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.705328 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-ovndb-tls-certs\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.705876 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-combined-ca-bundle\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.706718 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-config\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.725569 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-config-data\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.736846 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkk7t\" (UniqueName: \"kubernetes.io/projected/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-kube-api-access-zkk7t\") pod \"neutron-749b6895f6-pmvtl\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.738843 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sp8t\" (UniqueName: \"kubernetes.io/projected/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-kube-api-access-7sp8t\") pod 
\"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.750333 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-combined-ca-bundle\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:03 crc kubenswrapper[4873]: I0219 10:03:03.802915 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:04 crc kubenswrapper[4873]: E0219 10:03:04.691186 4873 secret.go:188] Couldn't get secret openstack/cert-barbican-public-svc: failed to sync secret cache: timed out waiting for the condition Feb 19 10:03:04 crc kubenswrapper[4873]: E0219 10:03:04.691529 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-public-tls-certs podName:76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3 nodeName:}" failed. No retries permitted until 2026-02-19 10:03:05.191510942 +0000 UTC m=+1094.480942580 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-public-tls-certs") pod "barbican-api-c4d59d6dd-4nh9w" (UID: "76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3") : failed to sync secret cache: timed out waiting for the condition Feb 19 10:03:04 crc kubenswrapper[4873]: E0219 10:03:04.691765 4873 secret.go:188] Couldn't get secret openstack/cert-barbican-internal-svc: failed to sync secret cache: timed out waiting for the condition Feb 19 10:03:04 crc kubenswrapper[4873]: E0219 10:03:04.691789 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-internal-tls-certs podName:76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3 nodeName:}" failed. No retries permitted until 2026-02-19 10:03:05.191782909 +0000 UTC m=+1094.481214537 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-internal-tls-certs") pod "barbican-api-c4d59d6dd-4nh9w" (UID: "76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3") : failed to sync secret cache: timed out waiting for the condition Feb 19 10:03:04 crc kubenswrapper[4873]: I0219 10:03:04.744471 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 19 10:03:04 crc kubenswrapper[4873]: I0219 10:03:04.790931 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 19 10:03:04 crc kubenswrapper[4873]: I0219 10:03:04.989439 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55dcd76767-7nrrt" podUID="a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" containerName="dnsmasq-dns" containerID="cri-o://69b8a44f78afd834bc213e814516e741948268a9c9eaf6506fb2962a05c91488" gracePeriod=10 Feb 19 10:03:05 crc kubenswrapper[4873]: I0219 10:03:05.146878 4873 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 10:03:05 crc kubenswrapper[4873]: I0219 10:03:05.147231 4873 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 10:03:05 crc kubenswrapper[4873]: I0219 10:03:05.179253 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 10:03:05 crc kubenswrapper[4873]: I0219 10:03:05.179365 4873 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 10:03:05 crc kubenswrapper[4873]: I0219 10:03:05.233210 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-internal-tls-certs\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:05 crc kubenswrapper[4873]: I0219 10:03:05.233324 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-public-tls-certs\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:05 crc kubenswrapper[4873]: I0219 10:03:05.234160 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Feb 19 10:03:05 crc kubenswrapper[4873]: I0219 10:03:05.238628 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-public-tls-certs\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:05 crc kubenswrapper[4873]: I0219 10:03:05.245692 4873 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3-internal-tls-certs\") pod \"barbican-api-c4d59d6dd-4nh9w\" (UID: \"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3\") " pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:05 crc kubenswrapper[4873]: I0219 10:03:05.317213 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:05 crc kubenswrapper[4873]: I0219 10:03:05.481249 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 10:03:05 crc kubenswrapper[4873]: I0219 10:03:05.879461 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.018066 4873 generic.go:334] "Generic (PLEG): container finished" podID="a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" containerID="69b8a44f78afd834bc213e814516e741948268a9c9eaf6506fb2962a05c91488" exitCode=0 Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.019251 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55dcd76767-7nrrt" event={"ID":"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9","Type":"ContainerDied","Data":"69b8a44f78afd834bc213e814516e741948268a9c9eaf6506fb2962a05c91488"} Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.315065 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-76cc4fb9fc-vdfd4"] Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.317630 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.329501 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.329720 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.341759 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76cc4fb9fc-vdfd4"] Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.359609 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-public-tls-certs\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.359660 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-ovndb-tls-certs\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.359732 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-internal-tls-certs\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.359797 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-combined-ca-bundle\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.359846 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-httpd-config\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.359873 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-config\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.359917 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2r2n\" (UniqueName: \"kubernetes.io/projected/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-kube-api-access-x2r2n\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.461525 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-public-tls-certs\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.461900 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-ovndb-tls-certs\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.461978 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-internal-tls-certs\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.462043 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-combined-ca-bundle\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.462089 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-httpd-config\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.462138 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-config\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.462187 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2r2n\" (UniqueName: \"kubernetes.io/projected/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-kube-api-access-x2r2n\") pod 
\"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.469205 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-config\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.473888 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-public-tls-certs\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.473882 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-internal-tls-certs\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.473945 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-ovndb-tls-certs\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.525753 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-httpd-config\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc 
kubenswrapper[4873]: I0219 10:03:06.525840 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-combined-ca-bundle\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.530807 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2r2n\" (UniqueName: \"kubernetes.io/projected/f168d086-aaa7-4a6e-9a65-5ab28e10a7e8-kube-api-access-x2r2n\") pod \"neutron-76cc4fb9fc-vdfd4\" (UID: \"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8\") " pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.615720 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Feb 19 10:03:06 crc kubenswrapper[4873]: I0219 10:03:06.677241 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:07 crc kubenswrapper[4873]: I0219 10:03:07.031020 4873 generic.go:334] "Generic (PLEG): container finished" podID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerID="5d8a6efa61f7a8c09f644a4fe742859469cc638e71ee46cb830c7a9e3cf72be0" exitCode=1 Feb 19 10:03:07 crc kubenswrapper[4873]: I0219 10:03:07.031058 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"ab7f7779-d6dd-4844-8af5-83ade972d9d0","Type":"ContainerDied","Data":"5d8a6efa61f7a8c09f644a4fe742859469cc638e71ee46cb830c7a9e3cf72be0"} Feb 19 10:03:07 crc kubenswrapper[4873]: I0219 10:03:07.031089 4873 scope.go:117] "RemoveContainer" containerID="a62f2b1b9a301f2ebebd7bf5613870ec1c3fc6e4830ec22341716d60b02e765d" Feb 19 10:03:07 crc kubenswrapper[4873]: I0219 10:03:07.031674 4873 scope.go:117] "RemoveContainer" containerID="5d8a6efa61f7a8c09f644a4fe742859469cc638e71ee46cb830c7a9e3cf72be0" Feb 19 10:03:07 crc kubenswrapper[4873]: E0219 10:03:07.031862 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(ab7f7779-d6dd-4844-8af5-83ade972d9d0)\"" pod="openstack/watcher-decision-engine-0" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" Feb 19 10:03:07 crc kubenswrapper[4873]: I0219 10:03:07.135674 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 10:03:07 crc kubenswrapper[4873]: I0219 10:03:07.135730 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 10:03:08 crc kubenswrapper[4873]: I0219 10:03:08.040266 4873 scope.go:117] "RemoveContainer" containerID="5d8a6efa61f7a8c09f644a4fe742859469cc638e71ee46cb830c7a9e3cf72be0" Feb 19 10:03:08 crc 
kubenswrapper[4873]: E0219 10:03:08.040745 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(ab7f7779-d6dd-4844-8af5-83ade972d9d0)\"" pod="openstack/watcher-decision-engine-0" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0"
Feb 19 10:03:08 crc kubenswrapper[4873]: I0219 10:03:08.119148 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Feb 19 10:03:08 crc kubenswrapper[4873]: I0219 10:03:08.145210 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Feb 19 10:03:09 crc kubenswrapper[4873]: I0219 10:03:09.083009 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Feb 19 10:03:10 crc kubenswrapper[4873]: I0219 10:03:10.065719 4873 generic.go:334] "Generic (PLEG): container finished" podID="ce5accb4-1da0-4a21-a289-7dba33ad935f" containerID="2874d7c078f6aebe4e7f936700ecedd6916a0afc5a2e7ddcc365abe01b70926a" exitCode=0
Feb 19 10:03:10 crc kubenswrapper[4873]: I0219 10:03:10.066671 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gqrb5" event={"ID":"ce5accb4-1da0-4a21-a289-7dba33ad935f","Type":"ContainerDied","Data":"2874d7c078f6aebe4e7f936700ecedd6916a0afc5a2e7ddcc365abe01b70926a"}
Feb 19 10:03:10 crc kubenswrapper[4873]: I0219 10:03:10.234451 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Feb 19 10:03:10 crc kubenswrapper[4873]: I0219 10:03:10.246901 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Feb 19 10:03:10 crc kubenswrapper[4873]: I0219 10:03:10.900672 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:10 crc kubenswrapper[4873]: I0219 10:03:10.965796 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-dns-svc\") pod \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") "
Feb 19 10:03:10 crc kubenswrapper[4873]: I0219 10:03:10.965870 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbv6s\" (UniqueName: \"kubernetes.io/projected/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-kube-api-access-sbv6s\") pod \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") "
Feb 19 10:03:10 crc kubenswrapper[4873]: I0219 10:03:10.965971 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-config\") pod \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") "
Feb 19 10:03:10 crc kubenswrapper[4873]: I0219 10:03:10.966131 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-ovsdbserver-sb\") pod \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") "
Feb 19 10:03:10 crc kubenswrapper[4873]: I0219 10:03:10.966161 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-ovsdbserver-nb\") pod \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") "
Feb 19 10:03:10 crc kubenswrapper[4873]: I0219 10:03:10.966240 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-dns-swift-storage-0\") pod \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\" (UID: \"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9\") "
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.000976 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-kube-api-access-sbv6s" (OuterVolumeSpecName: "kube-api-access-sbv6s") pod "a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" (UID: "a0e5418f-aef3-4aba-ba9b-f4e515fb18c9"). InnerVolumeSpecName "kube-api-access-sbv6s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.034511 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" (UID: "a0e5418f-aef3-4aba-ba9b-f4e515fb18c9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.060613 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" (UID: "a0e5418f-aef3-4aba-ba9b-f4e515fb18c9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.068383 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.068437 4873 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.068450 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbv6s\" (UniqueName: \"kubernetes.io/projected/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-kube-api-access-sbv6s\") on node \"crc\" DevicePath \"\""
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.093131 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" (UID: "a0e5418f-aef3-4aba-ba9b-f4e515fb18c9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.122933 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55dcd76767-7nrrt" event={"ID":"a0e5418f-aef3-4aba-ba9b-f4e515fb18c9","Type":"ContainerDied","Data":"b5551cf30b386908e2cde5ca7747852cdfecf30b5d0e1c2e9424decee5253a25"}
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.122999 4873 scope.go:117] "RemoveContainer" containerID="69b8a44f78afd834bc213e814516e741948268a9c9eaf6506fb2962a05c91488"
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.123243 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55dcd76767-7nrrt"
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.146615 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.147712 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-config" (OuterVolumeSpecName: "config") pod "a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" (UID: "a0e5418f-aef3-4aba-ba9b-f4e515fb18c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.155342 4873 scope.go:117] "RemoveContainer" containerID="a00affcc69d0a0f0f0948ccec9e176ec543a48a936aa44d66436705028401e67"
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.177006 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.177022 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-config\") on node \"crc\" DevicePath \"\""
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.220309 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" (UID: "a0e5418f-aef3-4aba-ba9b-f4e515fb18c9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.281994 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 10:03:11 crc kubenswrapper[4873]: E0219 10:03:11.300418 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="ab448dfd-a67c-49b5-a153-92a5a6f504b2"
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.376764 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-749b6895f6-pmvtl"]
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.477906 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55dcd76767-7nrrt"]
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.529038 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55dcd76767-7nrrt"]
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.529087 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-697b559f65-2zvb5"]
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.639740 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76cc4fb9fc-vdfd4"]
Feb 19 10:03:11 crc kubenswrapper[4873]: I0219 10:03:11.717925 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c4d59d6dd-4nh9w"]
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.156773 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667444df98-tdgw9" event={"ID":"9be5e1ee-a214-46ca-a5bf-d1d337848085","Type":"ContainerStarted","Data":"c6341001b5e009d25f87fdac6fd541731999dddfee636490d004349659a0895c"}
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.156826 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667444df98-tdgw9" event={"ID":"9be5e1ee-a214-46ca-a5bf-d1d337848085","Type":"ContainerStarted","Data":"cf7d70a01a1d05e9c05ea5d0d7ba3fe3e00df168e2fb9b29aef913e596b5166e"}
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.162218 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697b559f65-2zvb5" event={"ID":"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8","Type":"ContainerStarted","Data":"1b2761c8825453002d934e520c1c7afafdbc5067bc5158f1412768c1d6a606c1"}
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.179961 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cc4fb9fc-vdfd4" event={"ID":"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8","Type":"ContainerStarted","Data":"851d40a64ed5633a008606bc0f8f8d6dd768aa087b79b57994c68e881d2af84b"}
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.187485 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-596d5556df-fx4q8" event={"ID":"fc48b70c-5ab9-4765-a8cd-5985a3f63854","Type":"ContainerStarted","Data":"f853980fd3085a7656b173299c504a1ba631f4a032f6c9b41a6889decade7904"}
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.187528 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-596d5556df-fx4q8" event={"ID":"fc48b70c-5ab9-4765-a8cd-5985a3f63854","Type":"ContainerStarted","Data":"4b5d0d56217e2986147a41ce27c642e2f8b06114c0d3a3036bab7c5b0cc976e1"}
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.199328 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-667444df98-tdgw9" podStartSLOduration=3.57806258 podStartE2EDuration="13.199298393s" podCreationTimestamp="2026-02-19 10:02:59 +0000 UTC" firstStartedPulling="2026-02-19 10:03:01.173856583 +0000 UTC m=+1090.463288221" lastFinishedPulling="2026-02-19 10:03:10.795092396 +0000 UTC m=+1100.084524034" observedRunningTime="2026-02-19 10:03:12.180414084 +0000 UTC m=+1101.469845722" watchObservedRunningTime="2026-02-19 10:03:12.199298393 +0000 UTC m=+1101.488730031"
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.209782 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gqrb5" event={"ID":"ce5accb4-1da0-4a21-a289-7dba33ad935f","Type":"ContainerDied","Data":"7091928b68df42cd9ae5c284cfdb9622dc758710a4af850abe1bece12bfc74a3"}
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.209834 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7091928b68df42cd9ae5c284cfdb9622dc758710a4af850abe1bece12bfc74a3"
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.221222 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.225777 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-596d5556df-fx4q8" podStartSLOduration=3.850481668 podStartE2EDuration="13.225757859s" podCreationTimestamp="2026-02-19 10:02:59 +0000 UTC" firstStartedPulling="2026-02-19 10:03:01.403625953 +0000 UTC m=+1090.693057591" lastFinishedPulling="2026-02-19 10:03:10.778902144 +0000 UTC m=+1100.068333782" observedRunningTime="2026-02-19 10:03:12.21528926 +0000 UTC m=+1101.504720898" watchObservedRunningTime="2026-02-19 10:03:12.225757859 +0000 UTC m=+1101.515189497"
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.228054 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-749b6895f6-pmvtl" event={"ID":"3e2e96b4-be71-4257-a1ed-0c7427ed0e64","Type":"ContainerStarted","Data":"c2fc7030796f36afab7f9dbbf523310f26e22db5ef1e487e914036c0ac971b2a"}
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.228095 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-749b6895f6-pmvtl" event={"ID":"3e2e96b4-be71-4257-a1ed-0c7427ed0e64","Type":"ContainerStarted","Data":"936d413c07d2e70cda379bc1d9e56c3d69a0e75d48e2a897c8fb38cdf7c08e5e"}
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.229661 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c4d59d6dd-4nh9w" event={"ID":"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3","Type":"ContainerStarted","Data":"0a2ecaf652ead9d88e28513a7ab548a70faada0feffc370d3043bb7d333bce02"}
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.248373 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab448dfd-a67c-49b5-a153-92a5a6f504b2","Type":"ContainerStarted","Data":"bd797254af7d727ccef55cdbb9c374846a2546bce00a686191d6ba6af2767e10"}
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.248465 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerName="ceilometer-notification-agent" containerID="cri-o://ad6c5766518ab006754a3a4a7d650831c78a948d573045b4e146b11a48e7017e" gracePeriod=30
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.248649 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerName="proxy-httpd" containerID="cri-o://bd797254af7d727ccef55cdbb9c374846a2546bce00a686191d6ba6af2767e10" gracePeriod=30
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.248709 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerName="sg-core" containerID="cri-o://c3a492f4bbf32fe7c67e7763c6c5275b1ff0f9bdb61d830125564dd30c92b1f1" gracePeriod=30
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.258715 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.305925 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-config-data\") pod \"ce5accb4-1da0-4a21-a289-7dba33ad935f\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") "
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.305999 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-scripts\") pod \"ce5accb4-1da0-4a21-a289-7dba33ad935f\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") "
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.306724 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7ttv\" (UniqueName: \"kubernetes.io/projected/ce5accb4-1da0-4a21-a289-7dba33ad935f-kube-api-access-k7ttv\") pod \"ce5accb4-1da0-4a21-a289-7dba33ad935f\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") "
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.306810 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce5accb4-1da0-4a21-a289-7dba33ad935f-etc-machine-id\") pod \"ce5accb4-1da0-4a21-a289-7dba33ad935f\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") "
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.306887 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce5accb4-1da0-4a21-a289-7dba33ad935f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ce5accb4-1da0-4a21-a289-7dba33ad935f" (UID: "ce5accb4-1da0-4a21-a289-7dba33ad935f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.306997 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-combined-ca-bundle\") pod \"ce5accb4-1da0-4a21-a289-7dba33ad935f\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") "
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.307163 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-db-sync-config-data\") pod \"ce5accb4-1da0-4a21-a289-7dba33ad935f\" (UID: \"ce5accb4-1da0-4a21-a289-7dba33ad935f\") "
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.308274 4873 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce5accb4-1da0-4a21-a289-7dba33ad935f-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.313230 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ce5accb4-1da0-4a21-a289-7dba33ad935f" (UID: "ce5accb4-1da0-4a21-a289-7dba33ad935f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.313848 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce5accb4-1da0-4a21-a289-7dba33ad935f-kube-api-access-k7ttv" (OuterVolumeSpecName: "kube-api-access-k7ttv") pod "ce5accb4-1da0-4a21-a289-7dba33ad935f" (UID: "ce5accb4-1da0-4a21-a289-7dba33ad935f"). InnerVolumeSpecName "kube-api-access-k7ttv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.317131 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-scripts" (OuterVolumeSpecName: "scripts") pod "ce5accb4-1da0-4a21-a289-7dba33ad935f" (UID: "ce5accb4-1da0-4a21-a289-7dba33ad935f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.358271 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce5accb4-1da0-4a21-a289-7dba33ad935f" (UID: "ce5accb4-1da0-4a21-a289-7dba33ad935f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.411018 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.411056 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7ttv\" (UniqueName: \"kubernetes.io/projected/ce5accb4-1da0-4a21-a289-7dba33ad935f-kube-api-access-k7ttv\") on node \"crc\" DevicePath \"\""
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.411069 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.411081 4873 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.501802 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-config-data" (OuterVolumeSpecName: "config-data") pod "ce5accb4-1da0-4a21-a289-7dba33ad935f" (UID: "ce5accb4-1da0-4a21-a289-7dba33ad935f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.513002 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce5accb4-1da0-4a21-a289-7dba33ad935f-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.760422 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:12 crc kubenswrapper[4873]: I0219 10:03:12.955657 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-785b79c884-tswfl"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.275487 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-749b6895f6-pmvtl" event={"ID":"3e2e96b4-be71-4257-a1ed-0c7427ed0e64","Type":"ContainerStarted","Data":"a79fc5280241e2b1290e7c529d75e24bfc2e34924a4a60f5635bd08c8d066317"}
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.276209 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-749b6895f6-pmvtl"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.277708 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c4d59d6dd-4nh9w" event={"ID":"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3","Type":"ContainerStarted","Data":"961603804fece1276cbfe9325147ce80b3818f8bf8b93dd68ae6303709236cd6"}
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.277734 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c4d59d6dd-4nh9w" event={"ID":"76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3","Type":"ContainerStarted","Data":"894709ffcd09bfb9a507c96057667b335575a46f4c6b295398aad79b687e8c39"}
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.278133 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c4d59d6dd-4nh9w"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.278157 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c4d59d6dd-4nh9w"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.288239 4873 generic.go:334] "Generic (PLEG): container finished" podID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerID="bd797254af7d727ccef55cdbb9c374846a2546bce00a686191d6ba6af2767e10" exitCode=0
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.288266 4873 generic.go:334] "Generic (PLEG): container finished" podID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerID="c3a492f4bbf32fe7c67e7763c6c5275b1ff0f9bdb61d830125564dd30c92b1f1" exitCode=2
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.288306 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab448dfd-a67c-49b5-a153-92a5a6f504b2","Type":"ContainerDied","Data":"bd797254af7d727ccef55cdbb9c374846a2546bce00a686191d6ba6af2767e10"}
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.288330 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab448dfd-a67c-49b5-a153-92a5a6f504b2","Type":"ContainerDied","Data":"c3a492f4bbf32fe7c67e7763c6c5275b1ff0f9bdb61d830125564dd30c92b1f1"}
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.290295 4873 generic.go:334] "Generic (PLEG): container finished" podID="6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" containerID="2e532231fb9e05c5e12daaa7682ad61edb1cc2ae1641329effcd1ef595a0472f" exitCode=0
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.290329 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697b559f65-2zvb5" event={"ID":"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8","Type":"ContainerDied","Data":"2e532231fb9e05c5e12daaa7682ad61edb1cc2ae1641329effcd1ef595a0472f"}
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.295543 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gqrb5"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.296899 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cc4fb9fc-vdfd4" event={"ID":"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8","Type":"ContainerStarted","Data":"03af86cb9a03a5748c5d591d8dc080c065aa41860d02b9f72a77071d3d585291"}
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.296936 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-76cc4fb9fc-vdfd4"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.296946 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cc4fb9fc-vdfd4" event={"ID":"f168d086-aaa7-4a6e-9a65-5ab28e10a7e8","Type":"ContainerStarted","Data":"0c53089e0bb793cea82cf9543a9eb6219fa6e73cb39ca5491d0613d4928c1fa5"}
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.333378 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-749b6895f6-pmvtl" podStartSLOduration=10.333364314 podStartE2EDuration="10.333364314s" podCreationTimestamp="2026-02-19 10:03:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:13.300211555 +0000 UTC m=+1102.589643193" watchObservedRunningTime="2026-02-19 10:03:13.333364314 +0000 UTC m=+1102.622795952"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.349703 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-76cc4fb9fc-vdfd4" podStartSLOduration=7.349686382 podStartE2EDuration="7.349686382s" podCreationTimestamp="2026-02-19 10:03:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:13.332444121 +0000 UTC m=+1102.621875789" watchObservedRunningTime="2026-02-19 10:03:13.349686382 +0000 UTC m=+1102.639118020"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.439392 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-c4d59d6dd-4nh9w" podStartSLOduration=10.439363685 podStartE2EDuration="10.439363685s" podCreationTimestamp="2026-02-19 10:03:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:13.391118098 +0000 UTC m=+1102.680549736" watchObservedRunningTime="2026-02-19 10:03:13.439363685 +0000 UTC m=+1102.728795353"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.516036 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" path="/var/lib/kubelet/pods/a0e5418f-aef3-4aba-ba9b-f4e515fb18c9/volumes"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.584374 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 10:03:13 crc kubenswrapper[4873]: E0219 10:03:13.585299 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" containerName="dnsmasq-dns"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.585405 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" containerName="dnsmasq-dns"
Feb 19 10:03:13 crc kubenswrapper[4873]: E0219 10:03:13.585518 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5accb4-1da0-4a21-a289-7dba33ad935f" containerName="cinder-db-sync"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.585613 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5accb4-1da0-4a21-a289-7dba33ad935f" containerName="cinder-db-sync"
Feb 19 10:03:13 crc kubenswrapper[4873]: E0219 10:03:13.585706 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" containerName="init"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.585773 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" containerName="init"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.586034 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce5accb4-1da0-4a21-a289-7dba33ad935f" containerName="cinder-db-sync"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.586154 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" containerName="dnsmasq-dns"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.587478 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.595017 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.595202 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.595310 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tmcc9"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.595435 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.641936 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.769984 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a649f7b-88cb-4b43-bc71-06ab3237f955-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.770363 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.770624 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.772333 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94s6t\" (UniqueName: \"kubernetes.io/projected/5a649f7b-88cb-4b43-bc71-06ab3237f955-kube-api-access-94s6t\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.772548 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.772677 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.776743 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-697b559f65-2zvb5"]
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.812025 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8656fdbcc7-6lw5c"]
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.879055 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94s6t\" (UniqueName: \"kubernetes.io/projected/5a649f7b-88cb-4b43-bc71-06ab3237f955-kube-api-access-94s6t\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.879149 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.879192 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.879586 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.879625 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a649f7b-88cb-4b43-bc71-06ab3237f955-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.879759 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.889303 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a649f7b-88cb-4b43-bc71-06ab3237f955-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.896862 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.905010 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.910910 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8656fdbcc7-6lw5c"]
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.911046 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.922752 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.940478 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.950808 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94s6t\" (UniqueName: \"kubernetes.io/projected/5a649f7b-88cb-4b43-bc71-06ab3237f955-kube-api-access-94s6t\") pod \"cinder-scheduler-0\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " pod="openstack/cinder-scheduler-0"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.984005 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-config\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c"
Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.984074 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-dns-swift-storage-0\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c"
Feb 19 10:03:13 crc
kubenswrapper[4873]: I0219 10:03:13.984162 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-ovsdbserver-sb\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.984190 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-ovsdbserver-nb\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.984210 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwhw4\" (UniqueName: \"kubernetes.io/projected/e78542dc-01da-47dc-aec5-a380b7484425-kube-api-access-fwhw4\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:13 crc kubenswrapper[4873]: I0219 10:03:13.984297 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-dns-svc\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.006140 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.008422 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.014589 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.017386 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.054418 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.092260 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-dns-swift-storage-0\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.092429 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-config-data\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.092469 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8786cefd-adc3-4acf-bc04-066bc0510131-logs\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.092504 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-ovsdbserver-sb\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: 
\"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.092538 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-ovsdbserver-nb\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.092558 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwhw4\" (UniqueName: \"kubernetes.io/projected/e78542dc-01da-47dc-aec5-a380b7484425-kube-api-access-fwhw4\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.092590 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdhrt\" (UniqueName: \"kubernetes.io/projected/8786cefd-adc3-4acf-bc04-066bc0510131-kube-api-access-vdhrt\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.092719 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.092736 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.092762 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-scripts\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.092805 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-dns-svc\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.092979 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8786cefd-adc3-4acf-bc04-066bc0510131-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.093013 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-config\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.107051 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-config\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.110057 4873 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-dns-swift-storage-0\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.113705 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-dns-svc\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.114672 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-ovsdbserver-sb\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.122729 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-ovsdbserver-nb\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.137948 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwhw4\" (UniqueName: \"kubernetes.io/projected/e78542dc-01da-47dc-aec5-a380b7484425-kube-api-access-fwhw4\") pod \"dnsmasq-dns-8656fdbcc7-6lw5c\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.166145 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.194210 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8786cefd-adc3-4acf-bc04-066bc0510131-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.194307 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-config-data\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.194326 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8786cefd-adc3-4acf-bc04-066bc0510131-logs\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.194349 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdhrt\" (UniqueName: \"kubernetes.io/projected/8786cefd-adc3-4acf-bc04-066bc0510131-kube-api-access-vdhrt\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.194398 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.194414 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-config-data-custom\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.194430 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-scripts\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.195756 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8786cefd-adc3-4acf-bc04-066bc0510131-logs\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.196663 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8786cefd-adc3-4acf-bc04-066bc0510131-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.202215 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-config-data\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.202541 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.203572 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-config-data-custom\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.204177 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-scripts\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.213758 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdhrt\" (UniqueName: \"kubernetes.io/projected/8786cefd-adc3-4acf-bc04-066bc0510131-kube-api-access-vdhrt\") pod \"cinder-api-0\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.240308 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.334314 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-697b559f65-2zvb5" podUID="6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" containerName="dnsmasq-dns" containerID="cri-o://f456921ecb313aa05a7e9f52e2901ff2c55430f10241666557c3053d87edce20" gracePeriod=10 Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.334613 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697b559f65-2zvb5" event={"ID":"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8","Type":"ContainerStarted","Data":"f456921ecb313aa05a7e9f52e2901ff2c55430f10241666557c3053d87edce20"} Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.335720 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:14 
crc kubenswrapper[4873]: I0219 10:03:14.345827 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.367861 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.375976 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-697b559f65-2zvb5" podStartSLOduration=11.375955416 podStartE2EDuration="11.375955416s" podCreationTimestamp="2026-02-19 10:03:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:14.367218758 +0000 UTC m=+1103.656650406" watchObservedRunningTime="2026-02-19 10:03:14.375955416 +0000 UTC m=+1103.665387044" Feb 19 10:03:14 crc kubenswrapper[4873]: I0219 10:03:14.728970 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 10:03:14 crc kubenswrapper[4873]: W0219 10:03:14.756766 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a649f7b_88cb_4b43_bc71_06ab3237f955.slice/crio-d01b29d0a705eb2d4f63512fcea22d2e1a868ce32ce3cff9e21615cd03cc670a WatchSource:0}: Error finding container d01b29d0a705eb2d4f63512fcea22d2e1a868ce32ce3cff9e21615cd03cc670a: Status 404 returned error can't find the container with id d01b29d0a705eb2d4f63512fcea22d2e1a868ce32ce3cff9e21615cd03cc670a Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.220685 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8656fdbcc7-6lw5c"] Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.238791 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.270861 4873 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.308309 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.337179 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-dns-swift-storage-0\") pod \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.337234 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab448dfd-a67c-49b5-a153-92a5a6f504b2-run-httpd\") pod \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.337305 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-scripts\") pod \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.337336 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-ovsdbserver-nb\") pod \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.337364 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-sg-core-conf-yaml\") pod \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\" (UID: 
\"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.337440 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-ovsdbserver-sb\") pod \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.337518 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-config\") pod \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.337546 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-combined-ca-bundle\") pod \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.337593 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-config-data\") pod \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.337634 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab448dfd-a67c-49b5-a153-92a5a6f504b2-log-httpd\") pod \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.337654 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-dns-svc\") pod \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.337704 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sccds\" (UniqueName: \"kubernetes.io/projected/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-kube-api-access-sccds\") pod \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\" (UID: \"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8\") " Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.337722 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xxqg\" (UniqueName: \"kubernetes.io/projected/ab448dfd-a67c-49b5-a153-92a5a6f504b2-kube-api-access-4xxqg\") pod \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\" (UID: \"ab448dfd-a67c-49b5-a153-92a5a6f504b2\") " Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.342775 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab448dfd-a67c-49b5-a153-92a5a6f504b2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ab448dfd-a67c-49b5-a153-92a5a6f504b2" (UID: "ab448dfd-a67c-49b5-a153-92a5a6f504b2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.343073 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab448dfd-a67c-49b5-a153-92a5a6f504b2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ab448dfd-a67c-49b5-a153-92a5a6f504b2" (UID: "ab448dfd-a67c-49b5-a153-92a5a6f504b2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.350430 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-scripts" (OuterVolumeSpecName: "scripts") pod "ab448dfd-a67c-49b5-a153-92a5a6f504b2" (UID: "ab448dfd-a67c-49b5-a153-92a5a6f504b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.372754 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" event={"ID":"e78542dc-01da-47dc-aec5-a380b7484425","Type":"ContainerStarted","Data":"c7ffc8e18883ae90270b9d4c0dcb813698f920dfda3376430f83575ac81ce7b9"} Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.372832 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab448dfd-a67c-49b5-a153-92a5a6f504b2-kube-api-access-4xxqg" (OuterVolumeSpecName: "kube-api-access-4xxqg") pod "ab448dfd-a67c-49b5-a153-92a5a6f504b2" (UID: "ab448dfd-a67c-49b5-a153-92a5a6f504b2"). InnerVolumeSpecName "kube-api-access-4xxqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.372923 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-kube-api-access-sccds" (OuterVolumeSpecName: "kube-api-access-sccds") pod "6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" (UID: "6dfe8fe2-4637-46d9-b3bc-689e510c6ec8"). InnerVolumeSpecName "kube-api-access-sccds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.379678 4873 generic.go:334] "Generic (PLEG): container finished" podID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerID="ad6c5766518ab006754a3a4a7d650831c78a948d573045b4e146b11a48e7017e" exitCode=0 Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.379724 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab448dfd-a67c-49b5-a153-92a5a6f504b2","Type":"ContainerDied","Data":"ad6c5766518ab006754a3a4a7d650831c78a948d573045b4e146b11a48e7017e"} Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.379744 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab448dfd-a67c-49b5-a153-92a5a6f504b2","Type":"ContainerDied","Data":"c341b58fa66a9c7c1455f8e33fdfb22dd5f6b0a9b06cdb661264c78977069ea2"} Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.379761 4873 scope.go:117] "RemoveContainer" containerID="bd797254af7d727ccef55cdbb9c374846a2546bce00a686191d6ba6af2767e10" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.379878 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.401900 4873 generic.go:334] "Generic (PLEG): container finished" podID="6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" containerID="f456921ecb313aa05a7e9f52e2901ff2c55430f10241666557c3053d87edce20" exitCode=0 Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.401966 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-697b559f65-2zvb5" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.401983 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697b559f65-2zvb5" event={"ID":"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8","Type":"ContainerDied","Data":"f456921ecb313aa05a7e9f52e2901ff2c55430f10241666557c3053d87edce20"} Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.402016 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697b559f65-2zvb5" event={"ID":"6dfe8fe2-4637-46d9-b3bc-689e510c6ec8","Type":"ContainerDied","Data":"1b2761c8825453002d934e520c1c7afafdbc5067bc5158f1412768c1d6a606c1"} Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.419035 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55dcd76767-7nrrt" podUID="a0e5418f-aef3-4aba-ba9b-f4e515fb18c9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.177:5353: i/o timeout" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.421717 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8786cefd-adc3-4acf-bc04-066bc0510131","Type":"ContainerStarted","Data":"8e464848b3ebc3175565441b190b29e16936e4f1ed928d10cd26c6f756af71c1"} Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.430090 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" (UID: "6dfe8fe2-4637-46d9-b3bc-689e510c6ec8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.443436 4873 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab448dfd-a67c-49b5-a153-92a5a6f504b2-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.443462 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sccds\" (UniqueName: \"kubernetes.io/projected/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-kube-api-access-sccds\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.443474 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xxqg\" (UniqueName: \"kubernetes.io/projected/ab448dfd-a67c-49b5-a153-92a5a6f504b2-kube-api-access-4xxqg\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.443482 4873 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.443490 4873 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab448dfd-a67c-49b5-a153-92a5a6f504b2-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.443500 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.453913 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ab448dfd-a67c-49b5-a153-92a5a6f504b2" (UID: 
"ab448dfd-a67c-49b5-a153-92a5a6f504b2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.466705 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" (UID: "6dfe8fe2-4637-46d9-b3bc-689e510c6ec8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.471169 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a649f7b-88cb-4b43-bc71-06ab3237f955","Type":"ContainerStarted","Data":"d01b29d0a705eb2d4f63512fcea22d2e1a868ce32ce3cff9e21615cd03cc670a"} Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.480949 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" (UID: "6dfe8fe2-4637-46d9-b3bc-689e510c6ec8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.486143 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" (UID: "6dfe8fe2-4637-46d9-b3bc-689e510c6ec8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.487814 4873 scope.go:117] "RemoveContainer" containerID="c3a492f4bbf32fe7c67e7763c6c5275b1ff0f9bdb61d830125564dd30c92b1f1" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.498704 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-config" (OuterVolumeSpecName: "config") pod "6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" (UID: "6dfe8fe2-4637-46d9-b3bc-689e510c6ec8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.518448 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab448dfd-a67c-49b5-a153-92a5a6f504b2" (UID: "ab448dfd-a67c-49b5-a153-92a5a6f504b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.529864 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-config-data" (OuterVolumeSpecName: "config-data") pod "ab448dfd-a67c-49b5-a153-92a5a6f504b2" (UID: "ab448dfd-a67c-49b5-a153-92a5a6f504b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.530357 4873 scope.go:117] "RemoveContainer" containerID="ad6c5766518ab006754a3a4a7d650831c78a948d573045b4e146b11a48e7017e" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.545377 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.545400 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.545410 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.545419 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.545427 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.545436 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.545443 4873 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ab448dfd-a67c-49b5-a153-92a5a6f504b2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.596096 4873 scope.go:117] "RemoveContainer" containerID="bd797254af7d727ccef55cdbb9c374846a2546bce00a686191d6ba6af2767e10" Feb 19 10:03:15 crc kubenswrapper[4873]: E0219 10:03:15.596569 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd797254af7d727ccef55cdbb9c374846a2546bce00a686191d6ba6af2767e10\": container with ID starting with bd797254af7d727ccef55cdbb9c374846a2546bce00a686191d6ba6af2767e10 not found: ID does not exist" containerID="bd797254af7d727ccef55cdbb9c374846a2546bce00a686191d6ba6af2767e10" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.596617 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd797254af7d727ccef55cdbb9c374846a2546bce00a686191d6ba6af2767e10"} err="failed to get container status \"bd797254af7d727ccef55cdbb9c374846a2546bce00a686191d6ba6af2767e10\": rpc error: code = NotFound desc = could not find container \"bd797254af7d727ccef55cdbb9c374846a2546bce00a686191d6ba6af2767e10\": container with ID starting with bd797254af7d727ccef55cdbb9c374846a2546bce00a686191d6ba6af2767e10 not found: ID does not exist" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.596646 4873 scope.go:117] "RemoveContainer" containerID="c3a492f4bbf32fe7c67e7763c6c5275b1ff0f9bdb61d830125564dd30c92b1f1" Feb 19 10:03:15 crc kubenswrapper[4873]: E0219 10:03:15.597386 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3a492f4bbf32fe7c67e7763c6c5275b1ff0f9bdb61d830125564dd30c92b1f1\": container with ID starting with c3a492f4bbf32fe7c67e7763c6c5275b1ff0f9bdb61d830125564dd30c92b1f1 not found: ID does not exist" containerID="c3a492f4bbf32fe7c67e7763c6c5275b1ff0f9bdb61d830125564dd30c92b1f1" Feb 
19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.597419 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a492f4bbf32fe7c67e7763c6c5275b1ff0f9bdb61d830125564dd30c92b1f1"} err="failed to get container status \"c3a492f4bbf32fe7c67e7763c6c5275b1ff0f9bdb61d830125564dd30c92b1f1\": rpc error: code = NotFound desc = could not find container \"c3a492f4bbf32fe7c67e7763c6c5275b1ff0f9bdb61d830125564dd30c92b1f1\": container with ID starting with c3a492f4bbf32fe7c67e7763c6c5275b1ff0f9bdb61d830125564dd30c92b1f1 not found: ID does not exist" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.597447 4873 scope.go:117] "RemoveContainer" containerID="ad6c5766518ab006754a3a4a7d650831c78a948d573045b4e146b11a48e7017e" Feb 19 10:03:15 crc kubenswrapper[4873]: E0219 10:03:15.597883 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad6c5766518ab006754a3a4a7d650831c78a948d573045b4e146b11a48e7017e\": container with ID starting with ad6c5766518ab006754a3a4a7d650831c78a948d573045b4e146b11a48e7017e not found: ID does not exist" containerID="ad6c5766518ab006754a3a4a7d650831c78a948d573045b4e146b11a48e7017e" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.597920 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad6c5766518ab006754a3a4a7d650831c78a948d573045b4e146b11a48e7017e"} err="failed to get container status \"ad6c5766518ab006754a3a4a7d650831c78a948d573045b4e146b11a48e7017e\": rpc error: code = NotFound desc = could not find container \"ad6c5766518ab006754a3a4a7d650831c78a948d573045b4e146b11a48e7017e\": container with ID starting with ad6c5766518ab006754a3a4a7d650831c78a948d573045b4e146b11a48e7017e not found: ID does not exist" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.597967 4873 scope.go:117] "RemoveContainer" 
containerID="f456921ecb313aa05a7e9f52e2901ff2c55430f10241666557c3053d87edce20" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.642883 4873 scope.go:117] "RemoveContainer" containerID="2e532231fb9e05c5e12daaa7682ad61edb1cc2ae1641329effcd1ef595a0472f" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.824780 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.831077 4873 scope.go:117] "RemoveContainer" containerID="f456921ecb313aa05a7e9f52e2901ff2c55430f10241666557c3053d87edce20" Feb 19 10:03:15 crc kubenswrapper[4873]: E0219 10:03:15.837663 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f456921ecb313aa05a7e9f52e2901ff2c55430f10241666557c3053d87edce20\": container with ID starting with f456921ecb313aa05a7e9f52e2901ff2c55430f10241666557c3053d87edce20 not found: ID does not exist" containerID="f456921ecb313aa05a7e9f52e2901ff2c55430f10241666557c3053d87edce20" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.837704 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f456921ecb313aa05a7e9f52e2901ff2c55430f10241666557c3053d87edce20"} err="failed to get container status \"f456921ecb313aa05a7e9f52e2901ff2c55430f10241666557c3053d87edce20\": rpc error: code = NotFound desc = could not find container \"f456921ecb313aa05a7e9f52e2901ff2c55430f10241666557c3053d87edce20\": container with ID starting with f456921ecb313aa05a7e9f52e2901ff2c55430f10241666557c3053d87edce20 not found: ID does not exist" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.837731 4873 scope.go:117] "RemoveContainer" containerID="2e532231fb9e05c5e12daaa7682ad61edb1cc2ae1641329effcd1ef595a0472f" Feb 19 10:03:15 crc kubenswrapper[4873]: E0219 10:03:15.839222 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"2e532231fb9e05c5e12daaa7682ad61edb1cc2ae1641329effcd1ef595a0472f\": container with ID starting with 2e532231fb9e05c5e12daaa7682ad61edb1cc2ae1641329effcd1ef595a0472f not found: ID does not exist" containerID="2e532231fb9e05c5e12daaa7682ad61edb1cc2ae1641329effcd1ef595a0472f" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.839256 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e532231fb9e05c5e12daaa7682ad61edb1cc2ae1641329effcd1ef595a0472f"} err="failed to get container status \"2e532231fb9e05c5e12daaa7682ad61edb1cc2ae1641329effcd1ef595a0472f\": rpc error: code = NotFound desc = could not find container \"2e532231fb9e05c5e12daaa7682ad61edb1cc2ae1641329effcd1ef595a0472f\": container with ID starting with 2e532231fb9e05c5e12daaa7682ad61edb1cc2ae1641329effcd1ef595a0472f not found: ID does not exist" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.842445 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.869903 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-697b559f65-2zvb5"] Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.890211 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-697b559f65-2zvb5"] Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.946148 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:03:15 crc kubenswrapper[4873]: E0219 10:03:15.948532 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" containerName="init" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.948566 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" containerName="init" Feb 19 10:03:15 crc kubenswrapper[4873]: E0219 10:03:15.948585 4873 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" containerName="dnsmasq-dns" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.948595 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" containerName="dnsmasq-dns" Feb 19 10:03:15 crc kubenswrapper[4873]: E0219 10:03:15.948617 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerName="ceilometer-notification-agent" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.948627 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerName="ceilometer-notification-agent" Feb 19 10:03:15 crc kubenswrapper[4873]: E0219 10:03:15.948659 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerName="sg-core" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.948669 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerName="sg-core" Feb 19 10:03:15 crc kubenswrapper[4873]: E0219 10:03:15.948682 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerName="proxy-httpd" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.948689 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerName="proxy-httpd" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.951070 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerName="proxy-httpd" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.951181 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerName="ceilometer-notification-agent" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.951221 4873 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" containerName="dnsmasq-dns" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.951247 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" containerName="sg-core" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.976427 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.976881 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.982479 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 10:03:15 crc kubenswrapper[4873]: I0219 10:03:15.982860 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.082340 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.082423 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/437f5e56-e7c4-4280-9f75-2cf9e2496375-run-httpd\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.082458 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-scripts\") pod \"ceilometer-0\" (UID: 
\"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.082487 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-config-data\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.082539 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.082565 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/437f5e56-e7c4-4280-9f75-2cf9e2496375-log-httpd\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.082583 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxxwz\" (UniqueName: \"kubernetes.io/projected/437f5e56-e7c4-4280-9f75-2cf9e2496375-kube-api-access-sxxwz\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.184162 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.184238 
4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/437f5e56-e7c4-4280-9f75-2cf9e2496375-log-httpd\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.184256 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxxwz\" (UniqueName: \"kubernetes.io/projected/437f5e56-e7c4-4280-9f75-2cf9e2496375-kube-api-access-sxxwz\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.184331 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.184399 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/437f5e56-e7c4-4280-9f75-2cf9e2496375-run-httpd\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.184446 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-scripts\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.184478 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-config-data\") pod \"ceilometer-0\" (UID: 
\"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.185904 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/437f5e56-e7c4-4280-9f75-2cf9e2496375-run-httpd\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.186219 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/437f5e56-e7c4-4280-9f75-2cf9e2496375-log-httpd\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.191762 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.192081 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-scripts\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.193613 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.199024 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-config-data\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.213749 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxxwz\" (UniqueName: \"kubernetes.io/projected/437f5e56-e7c4-4280-9f75-2cf9e2496375-kube-api-access-sxxwz\") pod \"ceilometer-0\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.330190 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.516618 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8786cefd-adc3-4acf-bc04-066bc0510131","Type":"ContainerStarted","Data":"f05593f5088b36a20866dde0c189f6365ca5bb5d444303ca92f1e7f75e70ca2f"} Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.518338 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a649f7b-88cb-4b43-bc71-06ab3237f955","Type":"ContainerStarted","Data":"7a2c1898a31bc3c3c98e2419cf3054dc1ef8c249f821db6f4293689243122783"} Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.526628 4873 generic.go:334] "Generic (PLEG): container finished" podID="e78542dc-01da-47dc-aec5-a380b7484425" containerID="20fe864189fb33810eb3acc7dc0b89314091b0776fb2a2bfe18804bc13374185" exitCode=0 Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.526714 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" event={"ID":"e78542dc-01da-47dc-aec5-a380b7484425","Type":"ContainerDied","Data":"20fe864189fb33810eb3acc7dc0b89314091b0776fb2a2bfe18804bc13374185"} Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.595465 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-api-0"] Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.766666 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6687d9896d-v96j2" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.856228 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-87df9b646-2jf26"] Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.856459 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-87df9b646-2jf26" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerName="horizon-log" containerID="cri-o://1a1b6f4ba694daddb17f029a0bbce06c79e8294e69f096dade9d91ac98c03f81" gracePeriod=30 Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.856835 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-87df9b646-2jf26" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerName="horizon" containerID="cri-o://f728a5cace0f3c84844ee9bd7c5a0c48b5b5cad808dd5c682427cb942eb77db6" gracePeriod=30 Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.870270 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-87df9b646-2jf26" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Feb 19 10:03:16 crc kubenswrapper[4873]: I0219 10:03:16.899361 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:03:17 crc kubenswrapper[4873]: I0219 10:03:17.494836 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dfe8fe2-4637-46d9-b3bc-689e510c6ec8" path="/var/lib/kubelet/pods/6dfe8fe2-4637-46d9-b3bc-689e510c6ec8/volumes" Feb 19 10:03:17 crc kubenswrapper[4873]: I0219 10:03:17.497544 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab448dfd-a67c-49b5-a153-92a5a6f504b2" 
path="/var/lib/kubelet/pods/ab448dfd-a67c-49b5-a153-92a5a6f504b2/volumes" Feb 19 10:03:17 crc kubenswrapper[4873]: I0219 10:03:17.554569 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a649f7b-88cb-4b43-bc71-06ab3237f955","Type":"ContainerStarted","Data":"be36025eab8065799686eea6527ff5c2140eda51908e0f229e70d6b1bf945a72"} Feb 19 10:03:17 crc kubenswrapper[4873]: I0219 10:03:17.557309 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"437f5e56-e7c4-4280-9f75-2cf9e2496375","Type":"ContainerStarted","Data":"bde159d1e713bedb9f809d72fb731177094209d5eae3fafbca00c3bf92dd0edf"} Feb 19 10:03:17 crc kubenswrapper[4873]: I0219 10:03:17.557347 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"437f5e56-e7c4-4280-9f75-2cf9e2496375","Type":"ContainerStarted","Data":"a72128a70548416ae211c60013a87319728fca02cd7888fa60778dec8ba63ea4"} Feb 19 10:03:17 crc kubenswrapper[4873]: I0219 10:03:17.559361 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" event={"ID":"e78542dc-01da-47dc-aec5-a380b7484425","Type":"ContainerStarted","Data":"89333ead7926aa2618b4a528f8056c909774aca95003b189eba1f9e1ae277295"} Feb 19 10:03:17 crc kubenswrapper[4873]: I0219 10:03:17.559511 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:17 crc kubenswrapper[4873]: I0219 10:03:17.561294 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8786cefd-adc3-4acf-bc04-066bc0510131","Type":"ContainerStarted","Data":"d65db798423e51c2a9f8d6a3012c9ccc857964209249f5fc3748a28883833968"} Feb 19 10:03:17 crc kubenswrapper[4873]: I0219 10:03:17.561397 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8786cefd-adc3-4acf-bc04-066bc0510131" 
containerName="cinder-api-log" containerID="cri-o://f05593f5088b36a20866dde0c189f6365ca5bb5d444303ca92f1e7f75e70ca2f" gracePeriod=30 Feb 19 10:03:17 crc kubenswrapper[4873]: I0219 10:03:17.561432 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 10:03:17 crc kubenswrapper[4873]: I0219 10:03:17.561468 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8786cefd-adc3-4acf-bc04-066bc0510131" containerName="cinder-api" containerID="cri-o://d65db798423e51c2a9f8d6a3012c9ccc857964209249f5fc3748a28883833968" gracePeriod=30 Feb 19 10:03:17 crc kubenswrapper[4873]: I0219 10:03:17.590175 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.2054308559999996 podStartE2EDuration="4.590156927s" podCreationTimestamp="2026-02-19 10:03:13 +0000 UTC" firstStartedPulling="2026-02-19 10:03:14.76318862 +0000 UTC m=+1104.052620258" lastFinishedPulling="2026-02-19 10:03:15.147914691 +0000 UTC m=+1104.437346329" observedRunningTime="2026-02-19 10:03:17.581029959 +0000 UTC m=+1106.870461597" watchObservedRunningTime="2026-02-19 10:03:17.590156927 +0000 UTC m=+1106.879588565" Feb 19 10:03:17 crc kubenswrapper[4873]: I0219 10:03:17.616035 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" podStartSLOduration=4.616016764 podStartE2EDuration="4.616016764s" podCreationTimestamp="2026-02-19 10:03:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:17.610520077 +0000 UTC m=+1106.899951715" watchObservedRunningTime="2026-02-19 10:03:17.616016764 +0000 UTC m=+1106.905448402" Feb 19 10:03:17 crc kubenswrapper[4873]: I0219 10:03:17.644450 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" 
podStartSLOduration=4.644430245 podStartE2EDuration="4.644430245s" podCreationTimestamp="2026-02-19 10:03:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:17.639747158 +0000 UTC m=+1106.929178796" watchObservedRunningTime="2026-02-19 10:03:17.644430245 +0000 UTC m=+1106.933861883" Feb 19 10:03:18 crc kubenswrapper[4873]: I0219 10:03:18.012858 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-87df9b646-2jf26" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:59670->10.217.0.161:8443: read: connection reset by peer" Feb 19 10:03:18 crc kubenswrapper[4873]: E0219 10:03:18.233275 4873 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcace1157_1459_4823_aa8f_b2c246d3adeb.slice/crio-conmon-f728a5cace0f3c84844ee9bd7c5a0c48b5b5cad808dd5c682427cb942eb77db6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcace1157_1459_4823_aa8f_b2c246d3adeb.slice/crio-f728a5cace0f3c84844ee9bd7c5a0c48b5b5cad808dd5c682427cb942eb77db6.scope\": RecentStats: unable to find data in memory cache]" Feb 19 10:03:18 crc kubenswrapper[4873]: I0219 10:03:18.240247 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:03:18 crc kubenswrapper[4873]: I0219 10:03:18.240301 4873 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:03:18 crc kubenswrapper[4873]: I0219 10:03:18.484009 4873 scope.go:117] "RemoveContainer" containerID="5d8a6efa61f7a8c09f644a4fe742859469cc638e71ee46cb830c7a9e3cf72be0" Feb 19 10:03:18 crc kubenswrapper[4873]: I0219 10:03:18.572981 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"437f5e56-e7c4-4280-9f75-2cf9e2496375","Type":"ContainerStarted","Data":"ea2c29e9b6a6ae2111af938dab80bf0fe86bf1c95d1132e043df7ce04e75e6e4"} Feb 19 10:03:18 crc kubenswrapper[4873]: I0219 10:03:18.573030 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"437f5e56-e7c4-4280-9f75-2cf9e2496375","Type":"ContainerStarted","Data":"3858e9a302b9c6afbe14985a270c0ff8aac1476a89852d70300dc30e43cfb9a9"} Feb 19 10:03:18 crc kubenswrapper[4873]: I0219 10:03:18.576832 4873 generic.go:334] "Generic (PLEG): container finished" podID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerID="f728a5cace0f3c84844ee9bd7c5a0c48b5b5cad808dd5c682427cb942eb77db6" exitCode=0 Feb 19 10:03:18 crc kubenswrapper[4873]: I0219 10:03:18.576885 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87df9b646-2jf26" event={"ID":"cace1157-1459-4823-aa8f-b2c246d3adeb","Type":"ContainerDied","Data":"f728a5cace0f3c84844ee9bd7c5a0c48b5b5cad808dd5c682427cb942eb77db6"} Feb 19 10:03:18 crc kubenswrapper[4873]: I0219 10:03:18.579510 4873 generic.go:334] "Generic (PLEG): container finished" podID="8786cefd-adc3-4acf-bc04-066bc0510131" containerID="f05593f5088b36a20866dde0c189f6365ca5bb5d444303ca92f1e7f75e70ca2f" exitCode=143 Feb 19 10:03:18 crc kubenswrapper[4873]: I0219 10:03:18.580492 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"8786cefd-adc3-4acf-bc04-066bc0510131","Type":"ContainerDied","Data":"f05593f5088b36a20866dde0c189f6365ca5bb5d444303ca92f1e7f75e70ca2f"} Feb 19 10:03:19 crc kubenswrapper[4873]: I0219 10:03:19.009092 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 10:03:19 crc kubenswrapper[4873]: I0219 10:03:19.592870 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"ab7f7779-d6dd-4844-8af5-83ade972d9d0","Type":"ContainerStarted","Data":"cc54b787c703c958a190db022b86cf50c377c895c0e9b21773e78b4356509d96"} Feb 19 10:03:20 crc kubenswrapper[4873]: I0219 10:03:20.842116 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-87df9b646-2jf26" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Feb 19 10:03:21 crc kubenswrapper[4873]: I0219 10:03:21.614578 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"437f5e56-e7c4-4280-9f75-2cf9e2496375","Type":"ContainerStarted","Data":"7e5158670e3a62976761b268e07eaed28eb3274621570c9d17f2e4ba96a59299"} Feb 19 10:03:21 crc kubenswrapper[4873]: I0219 10:03:21.614732 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 10:03:21 crc kubenswrapper[4873]: I0219 10:03:21.641480 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.12694094 podStartE2EDuration="6.641453901s" podCreationTimestamp="2026-02-19 10:03:15 +0000 UTC" firstStartedPulling="2026-02-19 10:03:16.932422679 +0000 UTC m=+1106.221854327" lastFinishedPulling="2026-02-19 10:03:20.446935649 +0000 UTC m=+1109.736367288" observedRunningTime="2026-02-19 
10:03:21.634805335 +0000 UTC m=+1110.924236973" watchObservedRunningTime="2026-02-19 10:03:21.641453901 +0000 UTC m=+1110.930885539" Feb 19 10:03:22 crc kubenswrapper[4873]: I0219 10:03:22.193653 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:22 crc kubenswrapper[4873]: I0219 10:03:22.197154 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c4d59d6dd-4nh9w" Feb 19 10:03:22 crc kubenswrapper[4873]: I0219 10:03:22.306158 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-785b79c884-tswfl"] Feb 19 10:03:22 crc kubenswrapper[4873]: I0219 10:03:22.306462 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-785b79c884-tswfl" podUID="0d2df48a-78aa-4711-a0ac-268542093658" containerName="barbican-api-log" containerID="cri-o://01bd16909d7a2865478dd541aeaed2f91ba94845036b47bd03dd48be7c376611" gracePeriod=30 Feb 19 10:03:22 crc kubenswrapper[4873]: I0219 10:03:22.306569 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-785b79c884-tswfl" podUID="0d2df48a-78aa-4711-a0ac-268542093658" containerName="barbican-api" containerID="cri-o://626eef1424fde36fd6c591537282df1773ca875ce665ae3dc8e63713a1a95d18" gracePeriod=30 Feb 19 10:03:22 crc kubenswrapper[4873]: I0219 10:03:22.624334 4873 generic.go:334] "Generic (PLEG): container finished" podID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerID="cc54b787c703c958a190db022b86cf50c377c895c0e9b21773e78b4356509d96" exitCode=1 Feb 19 10:03:22 crc kubenswrapper[4873]: I0219 10:03:22.624389 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"ab7f7779-d6dd-4844-8af5-83ade972d9d0","Type":"ContainerDied","Data":"cc54b787c703c958a190db022b86cf50c377c895c0e9b21773e78b4356509d96"} Feb 19 10:03:22 crc kubenswrapper[4873]: I0219 
10:03:22.624418 4873 scope.go:117] "RemoveContainer" containerID="5d8a6efa61f7a8c09f644a4fe742859469cc638e71ee46cb830c7a9e3cf72be0" Feb 19 10:03:22 crc kubenswrapper[4873]: I0219 10:03:22.625002 4873 scope.go:117] "RemoveContainer" containerID="cc54b787c703c958a190db022b86cf50c377c895c0e9b21773e78b4356509d96" Feb 19 10:03:22 crc kubenswrapper[4873]: E0219 10:03:22.625238 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(ab7f7779-d6dd-4844-8af5-83ade972d9d0)\"" pod="openstack/watcher-decision-engine-0" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" Feb 19 10:03:22 crc kubenswrapper[4873]: I0219 10:03:22.634671 4873 generic.go:334] "Generic (PLEG): container finished" podID="0d2df48a-78aa-4711-a0ac-268542093658" containerID="01bd16909d7a2865478dd541aeaed2f91ba94845036b47bd03dd48be7c376611" exitCode=143 Feb 19 10:03:22 crc kubenswrapper[4873]: I0219 10:03:22.634734 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-785b79c884-tswfl" event={"ID":"0d2df48a-78aa-4711-a0ac-268542093658","Type":"ContainerDied","Data":"01bd16909d7a2865478dd541aeaed2f91ba94845036b47bd03dd48be7c376611"} Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.243846 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.298178 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.349237 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.424808 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5687f4c549-n4g4v"] 
Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.425037 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" podUID="64deb684-42f6-4bb5-b774-ef57839a56d5" containerName="dnsmasq-dns" containerID="cri-o://b6d3e647789f4d02e72fca723b661c289177f805a67c5edd75747ee3947add92" gracePeriod=10 Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.608810 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-785b79c884-tswfl" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.667426 4873 generic.go:334] "Generic (PLEG): container finished" podID="0d2df48a-78aa-4711-a0ac-268542093658" containerID="626eef1424fde36fd6c591537282df1773ca875ce665ae3dc8e63713a1a95d18" exitCode=0 Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.667493 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-785b79c884-tswfl" event={"ID":"0d2df48a-78aa-4711-a0ac-268542093658","Type":"ContainerDied","Data":"626eef1424fde36fd6c591537282df1773ca875ce665ae3dc8e63713a1a95d18"} Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.667522 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-785b79c884-tswfl" event={"ID":"0d2df48a-78aa-4711-a0ac-268542093658","Type":"ContainerDied","Data":"aa044ff7142e0c26ee94862c0e4c5ca488a9ed1c7a1ffa3af69735d62ea70cbd"} Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.667539 4873 scope.go:117] "RemoveContainer" containerID="626eef1424fde36fd6c591537282df1773ca875ce665ae3dc8e63713a1a95d18" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.667646 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-785b79c884-tswfl" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.679485 4873 generic.go:334] "Generic (PLEG): container finished" podID="64deb684-42f6-4bb5-b774-ef57839a56d5" containerID="b6d3e647789f4d02e72fca723b661c289177f805a67c5edd75747ee3947add92" exitCode=0 Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.679719 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5a649f7b-88cb-4b43-bc71-06ab3237f955" containerName="cinder-scheduler" containerID="cri-o://7a2c1898a31bc3c3c98e2419cf3054dc1ef8c249f821db6f4293689243122783" gracePeriod=30 Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.680039 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" event={"ID":"64deb684-42f6-4bb5-b774-ef57839a56d5","Type":"ContainerDied","Data":"b6d3e647789f4d02e72fca723b661c289177f805a67c5edd75747ee3947add92"} Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.680563 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5a649f7b-88cb-4b43-bc71-06ab3237f955" containerName="probe" containerID="cri-o://be36025eab8065799686eea6527ff5c2140eda51908e0f229e70d6b1bf945a72" gracePeriod=30 Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.684640 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlb7v\" (UniqueName: \"kubernetes.io/projected/0d2df48a-78aa-4711-a0ac-268542093658-kube-api-access-mlb7v\") pod \"0d2df48a-78aa-4711-a0ac-268542093658\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.684855 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-combined-ca-bundle\") pod 
\"0d2df48a-78aa-4711-a0ac-268542093658\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.684915 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d2df48a-78aa-4711-a0ac-268542093658-logs\") pod \"0d2df48a-78aa-4711-a0ac-268542093658\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.684959 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-config-data\") pod \"0d2df48a-78aa-4711-a0ac-268542093658\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.685003 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-config-data-custom\") pod \"0d2df48a-78aa-4711-a0ac-268542093658\" (UID: \"0d2df48a-78aa-4711-a0ac-268542093658\") " Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.687000 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d2df48a-78aa-4711-a0ac-268542093658-logs" (OuterVolumeSpecName: "logs") pod "0d2df48a-78aa-4711-a0ac-268542093658" (UID: "0d2df48a-78aa-4711-a0ac-268542093658"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.690920 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0d2df48a-78aa-4711-a0ac-268542093658" (UID: "0d2df48a-78aa-4711-a0ac-268542093658"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.693277 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d2df48a-78aa-4711-a0ac-268542093658-kube-api-access-mlb7v" (OuterVolumeSpecName: "kube-api-access-mlb7v") pod "0d2df48a-78aa-4711-a0ac-268542093658" (UID: "0d2df48a-78aa-4711-a0ac-268542093658"). InnerVolumeSpecName "kube-api-access-mlb7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.728227 4873 scope.go:117] "RemoveContainer" containerID="01bd16909d7a2865478dd541aeaed2f91ba94845036b47bd03dd48be7c376611" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.764330 4873 scope.go:117] "RemoveContainer" containerID="626eef1424fde36fd6c591537282df1773ca875ce665ae3dc8e63713a1a95d18" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.764347 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-config-data" (OuterVolumeSpecName: "config-data") pod "0d2df48a-78aa-4711-a0ac-268542093658" (UID: "0d2df48a-78aa-4711-a0ac-268542093658"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.764447 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d2df48a-78aa-4711-a0ac-268542093658" (UID: "0d2df48a-78aa-4711-a0ac-268542093658"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:24 crc kubenswrapper[4873]: E0219 10:03:24.764961 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626eef1424fde36fd6c591537282df1773ca875ce665ae3dc8e63713a1a95d18\": container with ID starting with 626eef1424fde36fd6c591537282df1773ca875ce665ae3dc8e63713a1a95d18 not found: ID does not exist" containerID="626eef1424fde36fd6c591537282df1773ca875ce665ae3dc8e63713a1a95d18" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.764991 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626eef1424fde36fd6c591537282df1773ca875ce665ae3dc8e63713a1a95d18"} err="failed to get container status \"626eef1424fde36fd6c591537282df1773ca875ce665ae3dc8e63713a1a95d18\": rpc error: code = NotFound desc = could not find container \"626eef1424fde36fd6c591537282df1773ca875ce665ae3dc8e63713a1a95d18\": container with ID starting with 626eef1424fde36fd6c591537282df1773ca875ce665ae3dc8e63713a1a95d18 not found: ID does not exist" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.765011 4873 scope.go:117] "RemoveContainer" containerID="01bd16909d7a2865478dd541aeaed2f91ba94845036b47bd03dd48be7c376611" Feb 19 10:03:24 crc kubenswrapper[4873]: E0219 10:03:24.765235 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01bd16909d7a2865478dd541aeaed2f91ba94845036b47bd03dd48be7c376611\": container with ID starting with 01bd16909d7a2865478dd541aeaed2f91ba94845036b47bd03dd48be7c376611 not found: ID does not exist" containerID="01bd16909d7a2865478dd541aeaed2f91ba94845036b47bd03dd48be7c376611" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.765255 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01bd16909d7a2865478dd541aeaed2f91ba94845036b47bd03dd48be7c376611"} err="failed 
to get container status \"01bd16909d7a2865478dd541aeaed2f91ba94845036b47bd03dd48be7c376611\": rpc error: code = NotFound desc = could not find container \"01bd16909d7a2865478dd541aeaed2f91ba94845036b47bd03dd48be7c376611\": container with ID starting with 01bd16909d7a2865478dd541aeaed2f91ba94845036b47bd03dd48be7c376611 not found: ID does not exist" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.791705 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.791746 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d2df48a-78aa-4711-a0ac-268542093658-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.791761 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.791769 4873 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d2df48a-78aa-4711-a0ac-268542093658-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.791780 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlb7v\" (UniqueName: \"kubernetes.io/projected/0d2df48a-78aa-4711-a0ac-268542093658-kube-api-access-mlb7v\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:24 crc kubenswrapper[4873]: I0219 10:03:24.974158 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.051163 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-785b79c884-tswfl"] Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.068138 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-785b79c884-tswfl"] Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.098310 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-dns-swift-storage-0\") pod \"64deb684-42f6-4bb5-b774-ef57839a56d5\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.098952 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9mv6\" (UniqueName: \"kubernetes.io/projected/64deb684-42f6-4bb5-b774-ef57839a56d5-kube-api-access-x9mv6\") pod \"64deb684-42f6-4bb5-b774-ef57839a56d5\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.099018 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-ovsdbserver-sb\") pod \"64deb684-42f6-4bb5-b774-ef57839a56d5\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.099050 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-config\") pod \"64deb684-42f6-4bb5-b774-ef57839a56d5\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.099131 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-ovsdbserver-nb\") pod \"64deb684-42f6-4bb5-b774-ef57839a56d5\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.099250 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-dns-svc\") pod \"64deb684-42f6-4bb5-b774-ef57839a56d5\" (UID: \"64deb684-42f6-4bb5-b774-ef57839a56d5\") " Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.115362 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64deb684-42f6-4bb5-b774-ef57839a56d5-kube-api-access-x9mv6" (OuterVolumeSpecName: "kube-api-access-x9mv6") pod "64deb684-42f6-4bb5-b774-ef57839a56d5" (UID: "64deb684-42f6-4bb5-b774-ef57839a56d5"). InnerVolumeSpecName "kube-api-access-x9mv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.144716 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "64deb684-42f6-4bb5-b774-ef57839a56d5" (UID: "64deb684-42f6-4bb5-b774-ef57839a56d5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.161770 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "64deb684-42f6-4bb5-b774-ef57839a56d5" (UID: "64deb684-42f6-4bb5-b774-ef57839a56d5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.164279 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-config" (OuterVolumeSpecName: "config") pod "64deb684-42f6-4bb5-b774-ef57839a56d5" (UID: "64deb684-42f6-4bb5-b774-ef57839a56d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.165130 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "64deb684-42f6-4bb5-b774-ef57839a56d5" (UID: "64deb684-42f6-4bb5-b774-ef57839a56d5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.189529 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "64deb684-42f6-4bb5-b774-ef57839a56d5" (UID: "64deb684-42f6-4bb5-b774-ef57839a56d5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.202628 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.202668 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.202678 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.202686 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.202695 4873 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64deb684-42f6-4bb5-b774-ef57839a56d5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.202704 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9mv6\" (UniqueName: \"kubernetes.io/projected/64deb684-42f6-4bb5-b774-ef57839a56d5-kube-api-access-x9mv6\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.498398 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d2df48a-78aa-4711-a0ac-268542093658" path="/var/lib/kubelet/pods/0d2df48a-78aa-4711-a0ac-268542093658/volumes" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.689328 4873 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" event={"ID":"64deb684-42f6-4bb5-b774-ef57839a56d5","Type":"ContainerDied","Data":"de0152a57feb0720c3ff97d1d52995f66e5e8c9b3cc0aff67e6dfa65b92a668a"} Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.689382 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5687f4c549-n4g4v" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.689423 4873 scope.go:117] "RemoveContainer" containerID="b6d3e647789f4d02e72fca723b661c289177f805a67c5edd75747ee3947add92" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.691622 4873 generic.go:334] "Generic (PLEG): container finished" podID="5a649f7b-88cb-4b43-bc71-06ab3237f955" containerID="be36025eab8065799686eea6527ff5c2140eda51908e0f229e70d6b1bf945a72" exitCode=0 Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.691664 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a649f7b-88cb-4b43-bc71-06ab3237f955","Type":"ContainerDied","Data":"be36025eab8065799686eea6527ff5c2140eda51908e0f229e70d6b1bf945a72"} Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.711424 4873 scope.go:117] "RemoveContainer" containerID="2156fdadae7d71bb536233ced37bfe76646867be4fb2b42c0784cff65fb2da11" Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.737510 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5687f4c549-n4g4v"] Feb 19 10:03:25 crc kubenswrapper[4873]: I0219 10:03:25.746277 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5687f4c549-n4g4v"] Feb 19 10:03:27 crc kubenswrapper[4873]: I0219 10:03:27.042343 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 19 10:03:27 crc kubenswrapper[4873]: I0219 10:03:27.140325 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openstack/watcher-decision-engine-0" Feb 19 10:03:27 crc kubenswrapper[4873]: I0219 10:03:27.140391 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 10:03:27 crc kubenswrapper[4873]: I0219 10:03:27.140405 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 19 10:03:27 crc kubenswrapper[4873]: I0219 10:03:27.140420 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 10:03:27 crc kubenswrapper[4873]: I0219 10:03:27.141192 4873 scope.go:117] "RemoveContainer" containerID="cc54b787c703c958a190db022b86cf50c377c895c0e9b21773e78b4356509d96" Feb 19 10:03:27 crc kubenswrapper[4873]: E0219 10:03:27.141470 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(ab7f7779-d6dd-4844-8af5-83ade972d9d0)\"" pod="openstack/watcher-decision-engine-0" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" Feb 19 10:03:27 crc kubenswrapper[4873]: I0219 10:03:27.494744 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64deb684-42f6-4bb5-b774-ef57839a56d5" path="/var/lib/kubelet/pods/64deb684-42f6-4bb5-b774-ef57839a56d5/volumes" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.490976 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5fcd445c48-xvpw4" Feb 19 10:03:28 crc kubenswrapper[4873]: E0219 10:03:28.499880 4873 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a649f7b_88cb_4b43_bc71_06ab3237f955.slice/crio-7a2c1898a31bc3c3c98e2419cf3054dc1ef8c249f821db6f4293689243122783.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a649f7b_88cb_4b43_bc71_06ab3237f955.slice/crio-conmon-7a2c1898a31bc3c3c98e2419cf3054dc1ef8c249f821db6f4293689243122783.scope\": RecentStats: unable to find data in memory cache]" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.773862 4873 generic.go:334] "Generic (PLEG): container finished" podID="5a649f7b-88cb-4b43-bc71-06ab3237f955" containerID="7a2c1898a31bc3c3c98e2419cf3054dc1ef8c249f821db6f4293689243122783" exitCode=0 Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.774066 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a649f7b-88cb-4b43-bc71-06ab3237f955","Type":"ContainerDied","Data":"7a2c1898a31bc3c3c98e2419cf3054dc1ef8c249f821db6f4293689243122783"} Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.858923 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 10:03:28 crc kubenswrapper[4873]: E0219 10:03:28.859495 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64deb684-42f6-4bb5-b774-ef57839a56d5" containerName="init" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.859520 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="64deb684-42f6-4bb5-b774-ef57839a56d5" containerName="init" Feb 19 10:03:28 crc kubenswrapper[4873]: E0219 10:03:28.859536 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2df48a-78aa-4711-a0ac-268542093658" containerName="barbican-api-log" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.859543 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2df48a-78aa-4711-a0ac-268542093658" containerName="barbican-api-log" Feb 19 10:03:28 crc kubenswrapper[4873]: E0219 10:03:28.859557 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2df48a-78aa-4711-a0ac-268542093658" containerName="barbican-api" Feb 19 10:03:28 
crc kubenswrapper[4873]: I0219 10:03:28.859562 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2df48a-78aa-4711-a0ac-268542093658" containerName="barbican-api" Feb 19 10:03:28 crc kubenswrapper[4873]: E0219 10:03:28.859589 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64deb684-42f6-4bb5-b774-ef57839a56d5" containerName="dnsmasq-dns" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.859595 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="64deb684-42f6-4bb5-b774-ef57839a56d5" containerName="dnsmasq-dns" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.859785 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2df48a-78aa-4711-a0ac-268542093658" containerName="barbican-api" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.859804 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="64deb684-42f6-4bb5-b774-ef57839a56d5" containerName="dnsmasq-dns" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.859829 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2df48a-78aa-4711-a0ac-268542093658" containerName="barbican-api-log" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.860616 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.865203 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-5plx7" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.865633 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.865874 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.874444 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.969740 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.983658 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkvrr\" (UniqueName: \"kubernetes.io/projected/5c4eb2b5-d272-49ff-938e-3e3359d29f46-kube-api-access-xkvrr\") pod \"openstackclient\" (UID: \"5c4eb2b5-d272-49ff-938e-3e3359d29f46\") " pod="openstack/openstackclient" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.983722 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c4eb2b5-d272-49ff-938e-3e3359d29f46-openstack-config-secret\") pod \"openstackclient\" (UID: \"5c4eb2b5-d272-49ff-938e-3e3359d29f46\") " pod="openstack/openstackclient" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.983761 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4eb2b5-d272-49ff-938e-3e3359d29f46-combined-ca-bundle\") pod 
\"openstackclient\" (UID: \"5c4eb2b5-d272-49ff-938e-3e3359d29f46\") " pod="openstack/openstackclient" Feb 19 10:03:28 crc kubenswrapper[4873]: I0219 10:03:28.983796 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5c4eb2b5-d272-49ff-938e-3e3359d29f46-openstack-config\") pod \"openstackclient\" (UID: \"5c4eb2b5-d272-49ff-938e-3e3359d29f46\") " pod="openstack/openstackclient" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.085122 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-config-data-custom\") pod \"5a649f7b-88cb-4b43-bc71-06ab3237f955\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.085214 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-combined-ca-bundle\") pod \"5a649f7b-88cb-4b43-bc71-06ab3237f955\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.085262 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-scripts\") pod \"5a649f7b-88cb-4b43-bc71-06ab3237f955\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.085324 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94s6t\" (UniqueName: \"kubernetes.io/projected/5a649f7b-88cb-4b43-bc71-06ab3237f955-kube-api-access-94s6t\") pod \"5a649f7b-88cb-4b43-bc71-06ab3237f955\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.085372 4873 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a649f7b-88cb-4b43-bc71-06ab3237f955-etc-machine-id\") pod \"5a649f7b-88cb-4b43-bc71-06ab3237f955\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.085445 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-config-data\") pod \"5a649f7b-88cb-4b43-bc71-06ab3237f955\" (UID: \"5a649f7b-88cb-4b43-bc71-06ab3237f955\") " Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.085641 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5c4eb2b5-d272-49ff-938e-3e3359d29f46-openstack-config\") pod \"openstackclient\" (UID: \"5c4eb2b5-d272-49ff-938e-3e3359d29f46\") " pod="openstack/openstackclient" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.085758 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkvrr\" (UniqueName: \"kubernetes.io/projected/5c4eb2b5-d272-49ff-938e-3e3359d29f46-kube-api-access-xkvrr\") pod \"openstackclient\" (UID: \"5c4eb2b5-d272-49ff-938e-3e3359d29f46\") " pod="openstack/openstackclient" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.085799 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c4eb2b5-d272-49ff-938e-3e3359d29f46-openstack-config-secret\") pod \"openstackclient\" (UID: \"5c4eb2b5-d272-49ff-938e-3e3359d29f46\") " pod="openstack/openstackclient" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.085833 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5c4eb2b5-d272-49ff-938e-3e3359d29f46-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5c4eb2b5-d272-49ff-938e-3e3359d29f46\") " pod="openstack/openstackclient" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.085849 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a649f7b-88cb-4b43-bc71-06ab3237f955-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5a649f7b-88cb-4b43-bc71-06ab3237f955" (UID: "5a649f7b-88cb-4b43-bc71-06ab3237f955"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.086981 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5c4eb2b5-d272-49ff-938e-3e3359d29f46-openstack-config\") pod \"openstackclient\" (UID: \"5c4eb2b5-d272-49ff-938e-3e3359d29f46\") " pod="openstack/openstackclient" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.091339 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5c4eb2b5-d272-49ff-938e-3e3359d29f46-openstack-config-secret\") pod \"openstackclient\" (UID: \"5c4eb2b5-d272-49ff-938e-3e3359d29f46\") " pod="openstack/openstackclient" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.095647 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4eb2b5-d272-49ff-938e-3e3359d29f46-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5c4eb2b5-d272-49ff-938e-3e3359d29f46\") " pod="openstack/openstackclient" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.104255 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a649f7b-88cb-4b43-bc71-06ab3237f955-kube-api-access-94s6t" (OuterVolumeSpecName: "kube-api-access-94s6t") pod 
"5a649f7b-88cb-4b43-bc71-06ab3237f955" (UID: "5a649f7b-88cb-4b43-bc71-06ab3237f955"). InnerVolumeSpecName "kube-api-access-94s6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.104356 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-scripts" (OuterVolumeSpecName: "scripts") pod "5a649f7b-88cb-4b43-bc71-06ab3237f955" (UID: "5a649f7b-88cb-4b43-bc71-06ab3237f955"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.104422 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5a649f7b-88cb-4b43-bc71-06ab3237f955" (UID: "5a649f7b-88cb-4b43-bc71-06ab3237f955"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.104596 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkvrr\" (UniqueName: \"kubernetes.io/projected/5c4eb2b5-d272-49ff-938e-3e3359d29f46-kube-api-access-xkvrr\") pod \"openstackclient\" (UID: \"5c4eb2b5-d272-49ff-938e-3e3359d29f46\") " pod="openstack/openstackclient" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.144184 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a649f7b-88cb-4b43-bc71-06ab3237f955" (UID: "5a649f7b-88cb-4b43-bc71-06ab3237f955"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.187342 4873 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a649f7b-88cb-4b43-bc71-06ab3237f955-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.187375 4873 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.187385 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.187395 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.187404 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94s6t\" (UniqueName: \"kubernetes.io/projected/5a649f7b-88cb-4b43-bc71-06ab3237f955-kube-api-access-94s6t\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.211781 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-config-data" (OuterVolumeSpecName: "config-data") pod "5a649f7b-88cb-4b43-bc71-06ab3237f955" (UID: "5a649f7b-88cb-4b43-bc71-06ab3237f955"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.266140 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.288890 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a649f7b-88cb-4b43-bc71-06ab3237f955-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.784162 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a649f7b-88cb-4b43-bc71-06ab3237f955","Type":"ContainerDied","Data":"d01b29d0a705eb2d4f63512fcea22d2e1a868ce32ce3cff9e21615cd03cc670a"} Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.784226 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.784440 4873 scope.go:117] "RemoveContainer" containerID="be36025eab8065799686eea6527ff5c2140eda51908e0f229e70d6b1bf945a72" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.810413 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.826696 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.836301 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.838784 4873 scope.go:117] "RemoveContainer" containerID="7a2c1898a31bc3c3c98e2419cf3054dc1ef8c249f821db6f4293689243122783" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.877132 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 10:03:29 crc kubenswrapper[4873]: E0219 10:03:29.877601 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a649f7b-88cb-4b43-bc71-06ab3237f955" containerName="probe" Feb 19 10:03:29 crc 
kubenswrapper[4873]: I0219 10:03:29.877622 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a649f7b-88cb-4b43-bc71-06ab3237f955" containerName="probe" Feb 19 10:03:29 crc kubenswrapper[4873]: E0219 10:03:29.877643 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a649f7b-88cb-4b43-bc71-06ab3237f955" containerName="cinder-scheduler" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.877651 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a649f7b-88cb-4b43-bc71-06ab3237f955" containerName="cinder-scheduler" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.877871 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a649f7b-88cb-4b43-bc71-06ab3237f955" containerName="probe" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.877912 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a649f7b-88cb-4b43-bc71-06ab3237f955" containerName="cinder-scheduler" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.879239 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.885044 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 10:03:29 crc kubenswrapper[4873]: I0219 10:03:29.891235 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.009170 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.009247 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.009308 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-scripts\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.009383 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8npk\" (UniqueName: \"kubernetes.io/projected/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-kube-api-access-g8npk\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 
10:03:30.009414 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.009437 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.111053 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-scripts\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.111132 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8npk\" (UniqueName: \"kubernetes.io/projected/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-kube-api-access-g8npk\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.111165 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.111191 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.111241 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.111279 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.111688 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.116950 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.117554 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-scripts\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " 
pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.120144 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.122356 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.131080 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8npk\" (UniqueName: \"kubernetes.io/projected/cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1-kube-api-access-g8npk\") pod \"cinder-scheduler-0\" (UID: \"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1\") " pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.241361 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.649663 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.687843 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6696d67b98-wrvnm" Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.797309 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 10:03:30 crc kubenswrapper[4873]: W0219 10:03:30.802335 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd9b32e6_4f78_4f9c_9fbd_e91b37d110a1.slice/crio-9ad99f1c11152aa7d83b4e1f00e4c64fe31ea74d537313adc0861ddbf7260217 WatchSource:0}: Error finding container 9ad99f1c11152aa7d83b4e1f00e4c64fe31ea74d537313adc0861ddbf7260217: Status 404 returned error can't find the container with id 9ad99f1c11152aa7d83b4e1f00e4c64fe31ea74d537313adc0861ddbf7260217 Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.803223 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5c4eb2b5-d272-49ff-938e-3e3359d29f46","Type":"ContainerStarted","Data":"e2481c1a64880156900e4d68507e2c14f2f2aad465852c3198aa97ae6f916e45"} Feb 19 10:03:30 crc kubenswrapper[4873]: I0219 10:03:30.842727 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-87df9b646-2jf26" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Feb 19 10:03:31 crc kubenswrapper[4873]: I0219 10:03:31.502026 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a649f7b-88cb-4b43-bc71-06ab3237f955" 
path="/var/lib/kubelet/pods/5a649f7b-88cb-4b43-bc71-06ab3237f955/volumes" Feb 19 10:03:31 crc kubenswrapper[4873]: I0219 10:03:31.824961 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1","Type":"ContainerStarted","Data":"abdf2115fa92347ecdc390e94a1e3534c23229e1ed23bae8a1715190c991fba3"} Feb 19 10:03:31 crc kubenswrapper[4873]: I0219 10:03:31.825004 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1","Type":"ContainerStarted","Data":"9ad99f1c11152aa7d83b4e1f00e4c64fe31ea74d537313adc0861ddbf7260217"} Feb 19 10:03:32 crc kubenswrapper[4873]: I0219 10:03:32.851103 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1","Type":"ContainerStarted","Data":"46a182591dec41dd73b8c2c2e862a2acb3808dded2e789b0b636592adcb8b401"} Feb 19 10:03:32 crc kubenswrapper[4873]: I0219 10:03:32.877385 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.877359028 podStartE2EDuration="3.877359028s" podCreationTimestamp="2026-02-19 10:03:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:32.868778184 +0000 UTC m=+1122.158209832" watchObservedRunningTime="2026-02-19 10:03:32.877359028 +0000 UTC m=+1122.166790666" Feb 19 10:03:33 crc kubenswrapper[4873]: I0219 10:03:33.821528 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:35 crc kubenswrapper[4873]: I0219 10:03:35.241660 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 10:03:35 crc kubenswrapper[4873]: I0219 10:03:35.866302 4873 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:03:35 crc kubenswrapper[4873]: I0219 10:03:35.866649 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="ceilometer-central-agent" containerID="cri-o://bde159d1e713bedb9f809d72fb731177094209d5eae3fafbca00c3bf92dd0edf" gracePeriod=30 Feb 19 10:03:35 crc kubenswrapper[4873]: I0219 10:03:35.866723 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="sg-core" containerID="cri-o://ea2c29e9b6a6ae2111af938dab80bf0fe86bf1c95d1132e043df7ce04e75e6e4" gracePeriod=30 Feb 19 10:03:35 crc kubenswrapper[4873]: I0219 10:03:35.866765 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="proxy-httpd" containerID="cri-o://7e5158670e3a62976761b268e07eaed28eb3274621570c9d17f2e4ba96a59299" gracePeriod=30 Feb 19 10:03:35 crc kubenswrapper[4873]: I0219 10:03:35.866766 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="ceilometer-notification-agent" containerID="cri-o://3858e9a302b9c6afbe14985a270c0ff8aac1476a89852d70300dc30e43cfb9a9" gracePeriod=30 Feb 19 10:03:35 crc kubenswrapper[4873]: I0219 10:03:35.889940 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.186:3000/\": EOF" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.427432 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7c6d694569-qbpxm"] Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.429260 4873 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.432574 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.432687 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.433432 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.448300 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7c6d694569-qbpxm"] Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.576497 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d51beb70-e455-4e75-9e06-863b41fbf9a8-combined-ca-bundle\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.576554 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d51beb70-e455-4e75-9e06-863b41fbf9a8-log-httpd\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.576604 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d51beb70-e455-4e75-9e06-863b41fbf9a8-config-data\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc 
kubenswrapper[4873]: I0219 10:03:36.576629 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d51beb70-e455-4e75-9e06-863b41fbf9a8-internal-tls-certs\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.576680 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72rmb\" (UniqueName: \"kubernetes.io/projected/d51beb70-e455-4e75-9e06-863b41fbf9a8-kube-api-access-72rmb\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.576720 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d51beb70-e455-4e75-9e06-863b41fbf9a8-run-httpd\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.576738 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d51beb70-e455-4e75-9e06-863b41fbf9a8-public-tls-certs\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.576757 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d51beb70-e455-4e75-9e06-863b41fbf9a8-etc-swift\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " 
pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.678789 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d51beb70-e455-4e75-9e06-863b41fbf9a8-run-httpd\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.678862 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d51beb70-e455-4e75-9e06-863b41fbf9a8-etc-swift\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.678890 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d51beb70-e455-4e75-9e06-863b41fbf9a8-public-tls-certs\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.678982 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d51beb70-e455-4e75-9e06-863b41fbf9a8-combined-ca-bundle\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.679030 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d51beb70-e455-4e75-9e06-863b41fbf9a8-log-httpd\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc 
kubenswrapper[4873]: I0219 10:03:36.679102 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d51beb70-e455-4e75-9e06-863b41fbf9a8-config-data\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.679163 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d51beb70-e455-4e75-9e06-863b41fbf9a8-internal-tls-certs\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.679251 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72rmb\" (UniqueName: \"kubernetes.io/projected/d51beb70-e455-4e75-9e06-863b41fbf9a8-kube-api-access-72rmb\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.679448 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d51beb70-e455-4e75-9e06-863b41fbf9a8-run-httpd\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.679784 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d51beb70-e455-4e75-9e06-863b41fbf9a8-log-httpd\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.686161 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d51beb70-e455-4e75-9e06-863b41fbf9a8-internal-tls-certs\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.697943 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d51beb70-e455-4e75-9e06-863b41fbf9a8-config-data\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.699001 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d51beb70-e455-4e75-9e06-863b41fbf9a8-etc-swift\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.700302 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d51beb70-e455-4e75-9e06-863b41fbf9a8-public-tls-certs\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.701650 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d51beb70-e455-4e75-9e06-863b41fbf9a8-combined-ca-bundle\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.704626 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72rmb\" (UniqueName: 
\"kubernetes.io/projected/d51beb70-e455-4e75-9e06-863b41fbf9a8-kube-api-access-72rmb\") pod \"swift-proxy-7c6d694569-qbpxm\" (UID: \"d51beb70-e455-4e75-9e06-863b41fbf9a8\") " pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.719628 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-76cc4fb9fc-vdfd4" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.760652 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.836135 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-749b6895f6-pmvtl"] Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.837226 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-749b6895f6-pmvtl" podUID="3e2e96b4-be71-4257-a1ed-0c7427ed0e64" containerName="neutron-httpd" containerID="cri-o://a79fc5280241e2b1290e7c529d75e24bfc2e34924a4a60f5635bd08c8d066317" gracePeriod=30 Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.836766 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-749b6895f6-pmvtl" podUID="3e2e96b4-be71-4257-a1ed-0c7427ed0e64" containerName="neutron-api" containerID="cri-o://c2fc7030796f36afab7f9dbbf523310f26e22db5ef1e487e914036c0ac971b2a" gracePeriod=30 Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.904127 4873 generic.go:334] "Generic (PLEG): container finished" podID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerID="7e5158670e3a62976761b268e07eaed28eb3274621570c9d17f2e4ba96a59299" exitCode=0 Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.904168 4873 generic.go:334] "Generic (PLEG): container finished" podID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerID="ea2c29e9b6a6ae2111af938dab80bf0fe86bf1c95d1132e043df7ce04e75e6e4" exitCode=2 Feb 19 10:03:36 crc 
kubenswrapper[4873]: I0219 10:03:36.904179 4873 generic.go:334] "Generic (PLEG): container finished" podID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerID="bde159d1e713bedb9f809d72fb731177094209d5eae3fafbca00c3bf92dd0edf" exitCode=0 Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.904202 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"437f5e56-e7c4-4280-9f75-2cf9e2496375","Type":"ContainerDied","Data":"7e5158670e3a62976761b268e07eaed28eb3274621570c9d17f2e4ba96a59299"} Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.904231 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"437f5e56-e7c4-4280-9f75-2cf9e2496375","Type":"ContainerDied","Data":"ea2c29e9b6a6ae2111af938dab80bf0fe86bf1c95d1132e043df7ce04e75e6e4"} Feb 19 10:03:36 crc kubenswrapper[4873]: I0219 10:03:36.904246 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"437f5e56-e7c4-4280-9f75-2cf9e2496375","Type":"ContainerDied","Data":"bde159d1e713bedb9f809d72fb731177094209d5eae3fafbca00c3bf92dd0edf"} Feb 19 10:03:37 crc kubenswrapper[4873]: I0219 10:03:37.935398 4873 generic.go:334] "Generic (PLEG): container finished" podID="3e2e96b4-be71-4257-a1ed-0c7427ed0e64" containerID="a79fc5280241e2b1290e7c529d75e24bfc2e34924a4a60f5635bd08c8d066317" exitCode=0 Feb 19 10:03:37 crc kubenswrapper[4873]: I0219 10:03:37.935445 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-749b6895f6-pmvtl" event={"ID":"3e2e96b4-be71-4257-a1ed-0c7427ed0e64","Type":"ContainerDied","Data":"a79fc5280241e2b1290e7c529d75e24bfc2e34924a4a60f5635bd08c8d066317"} Feb 19 10:03:38 crc kubenswrapper[4873]: I0219 10:03:38.947536 4873 generic.go:334] "Generic (PLEG): container finished" podID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerID="3858e9a302b9c6afbe14985a270c0ff8aac1476a89852d70300dc30e43cfb9a9" exitCode=0 Feb 19 10:03:38 crc kubenswrapper[4873]: I0219 
10:03:38.947586 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"437f5e56-e7c4-4280-9f75-2cf9e2496375","Type":"ContainerDied","Data":"3858e9a302b9c6afbe14985a270c0ff8aac1476a89852d70300dc30e43cfb9a9"} Feb 19 10:03:39 crc kubenswrapper[4873]: I0219 10:03:39.830270 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-cqfhq"] Feb 19 10:03:39 crc kubenswrapper[4873]: I0219 10:03:39.831692 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cqfhq" Feb 19 10:03:39 crc kubenswrapper[4873]: I0219 10:03:39.847735 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-cqfhq"] Feb 19 10:03:39 crc kubenswrapper[4873]: I0219 10:03:39.937340 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-hbt9r"] Feb 19 10:03:39 crc kubenswrapper[4873]: I0219 10:03:39.939045 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-hbt9r" Feb 19 10:03:39 crc kubenswrapper[4873]: I0219 10:03:39.956054 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hbt9r"] Feb 19 10:03:39 crc kubenswrapper[4873]: I0219 10:03:39.970363 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c26aa2d-a8f4-4645-a1b6-055cb88e64d6-operator-scripts\") pod \"nova-api-db-create-cqfhq\" (UID: \"3c26aa2d-a8f4-4645-a1b6-055cb88e64d6\") " pod="openstack/nova-api-db-create-cqfhq" Feb 19 10:03:39 crc kubenswrapper[4873]: I0219 10:03:39.970528 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlp77\" (UniqueName: \"kubernetes.io/projected/3c26aa2d-a8f4-4645-a1b6-055cb88e64d6-kube-api-access-wlp77\") pod \"nova-api-db-create-cqfhq\" (UID: \"3c26aa2d-a8f4-4645-a1b6-055cb88e64d6\") " pod="openstack/nova-api-db-create-cqfhq" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.071798 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d9dc-account-create-update-p9wrt"] Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.073689 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c26aa2d-a8f4-4645-a1b6-055cb88e64d6-operator-scripts\") pod \"nova-api-db-create-cqfhq\" (UID: \"3c26aa2d-a8f4-4645-a1b6-055cb88e64d6\") " pod="openstack/nova-api-db-create-cqfhq" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.074125 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gctfv\" (UniqueName: \"kubernetes.io/projected/b1d06337-fba1-4b9c-abbc-02f635fd3bdd-kube-api-access-gctfv\") pod \"nova-cell0-db-create-hbt9r\" (UID: \"b1d06337-fba1-4b9c-abbc-02f635fd3bdd\") " 
pod="openstack/nova-cell0-db-create-hbt9r" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.074295 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlp77\" (UniqueName: \"kubernetes.io/projected/3c26aa2d-a8f4-4645-a1b6-055cb88e64d6-kube-api-access-wlp77\") pod \"nova-api-db-create-cqfhq\" (UID: \"3c26aa2d-a8f4-4645-a1b6-055cb88e64d6\") " pod="openstack/nova-api-db-create-cqfhq" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.074474 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d06337-fba1-4b9c-abbc-02f635fd3bdd-operator-scripts\") pod \"nova-cell0-db-create-hbt9r\" (UID: \"b1d06337-fba1-4b9c-abbc-02f635fd3bdd\") " pod="openstack/nova-cell0-db-create-hbt9r" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.073728 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d9dc-account-create-update-p9wrt" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.100443 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c26aa2d-a8f4-4645-a1b6-055cb88e64d6-operator-scripts\") pod \"nova-api-db-create-cqfhq\" (UID: \"3c26aa2d-a8f4-4645-a1b6-055cb88e64d6\") " pod="openstack/nova-api-db-create-cqfhq" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.102015 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.124453 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d9dc-account-create-update-p9wrt"] Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.165827 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlp77\" (UniqueName: 
\"kubernetes.io/projected/3c26aa2d-a8f4-4645-a1b6-055cb88e64d6-kube-api-access-wlp77\") pod \"nova-api-db-create-cqfhq\" (UID: \"3c26aa2d-a8f4-4645-a1b6-055cb88e64d6\") " pod="openstack/nova-api-db-create-cqfhq" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.180984 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3129fa03-2686-49af-a434-341b19fb6661-operator-scripts\") pod \"nova-api-d9dc-account-create-update-p9wrt\" (UID: \"3129fa03-2686-49af-a434-341b19fb6661\") " pod="openstack/nova-api-d9dc-account-create-update-p9wrt" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.181346 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gctfv\" (UniqueName: \"kubernetes.io/projected/b1d06337-fba1-4b9c-abbc-02f635fd3bdd-kube-api-access-gctfv\") pod \"nova-cell0-db-create-hbt9r\" (UID: \"b1d06337-fba1-4b9c-abbc-02f635fd3bdd\") " pod="openstack/nova-cell0-db-create-hbt9r" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.181719 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d06337-fba1-4b9c-abbc-02f635fd3bdd-operator-scripts\") pod \"nova-cell0-db-create-hbt9r\" (UID: \"b1d06337-fba1-4b9c-abbc-02f635fd3bdd\") " pod="openstack/nova-cell0-db-create-hbt9r" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.181891 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r42c\" (UniqueName: \"kubernetes.io/projected/3129fa03-2686-49af-a434-341b19fb6661-kube-api-access-8r42c\") pod \"nova-api-d9dc-account-create-update-p9wrt\" (UID: \"3129fa03-2686-49af-a434-341b19fb6661\") " pod="openstack/nova-api-d9dc-account-create-update-p9wrt" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.183553 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d06337-fba1-4b9c-abbc-02f635fd3bdd-operator-scripts\") pod \"nova-cell0-db-create-hbt9r\" (UID: \"b1d06337-fba1-4b9c-abbc-02f635fd3bdd\") " pod="openstack/nova-cell0-db-create-hbt9r" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.183706 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-5862l"] Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.185590 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5862l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.222593 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5862l"] Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.236642 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gctfv\" (UniqueName: \"kubernetes.io/projected/b1d06337-fba1-4b9c-abbc-02f635fd3bdd-kube-api-access-gctfv\") pod \"nova-cell0-db-create-hbt9r\" (UID: \"b1d06337-fba1-4b9c-abbc-02f635fd3bdd\") " pod="openstack/nova-cell0-db-create-hbt9r" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.261032 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-00fb-account-create-update-4594l"] Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.262251 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-00fb-account-create-update-4594l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.263570 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-hbt9r" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.275484 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.277710 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-00fb-account-create-update-4594l"] Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.295176 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r42c\" (UniqueName: \"kubernetes.io/projected/3129fa03-2686-49af-a434-341b19fb6661-kube-api-access-8r42c\") pod \"nova-api-d9dc-account-create-update-p9wrt\" (UID: \"3129fa03-2686-49af-a434-341b19fb6661\") " pod="openstack/nova-api-d9dc-account-create-update-p9wrt" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.295279 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7623a19-7720-48a2-9a09-7c1d9d1acf3a-operator-scripts\") pod \"nova-cell1-db-create-5862l\" (UID: \"c7623a19-7720-48a2-9a09-7c1d9d1acf3a\") " pod="openstack/nova-cell1-db-create-5862l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.295351 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvr7l\" (UniqueName: \"kubernetes.io/projected/c7623a19-7720-48a2-9a09-7c1d9d1acf3a-kube-api-access-mvr7l\") pod \"nova-cell1-db-create-5862l\" (UID: \"c7623a19-7720-48a2-9a09-7c1d9d1acf3a\") " pod="openstack/nova-cell1-db-create-5862l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.297873 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3129fa03-2686-49af-a434-341b19fb6661-operator-scripts\") pod \"nova-api-d9dc-account-create-update-p9wrt\" (UID: 
\"3129fa03-2686-49af-a434-341b19fb6661\") " pod="openstack/nova-api-d9dc-account-create-update-p9wrt" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.299199 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3129fa03-2686-49af-a434-341b19fb6661-operator-scripts\") pod \"nova-api-d9dc-account-create-update-p9wrt\" (UID: \"3129fa03-2686-49af-a434-341b19fb6661\") " pod="openstack/nova-api-d9dc-account-create-update-p9wrt" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.323267 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r42c\" (UniqueName: \"kubernetes.io/projected/3129fa03-2686-49af-a434-341b19fb6661-kube-api-access-8r42c\") pod \"nova-api-d9dc-account-create-update-p9wrt\" (UID: \"3129fa03-2686-49af-a434-341b19fb6661\") " pod="openstack/nova-api-d9dc-account-create-update-p9wrt" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.402646 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7623a19-7720-48a2-9a09-7c1d9d1acf3a-operator-scripts\") pod \"nova-cell1-db-create-5862l\" (UID: \"c7623a19-7720-48a2-9a09-7c1d9d1acf3a\") " pod="openstack/nova-cell1-db-create-5862l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.402715 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvr7l\" (UniqueName: \"kubernetes.io/projected/c7623a19-7720-48a2-9a09-7c1d9d1acf3a-kube-api-access-mvr7l\") pod \"nova-cell1-db-create-5862l\" (UID: \"c7623a19-7720-48a2-9a09-7c1d9d1acf3a\") " pod="openstack/nova-cell1-db-create-5862l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.402845 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0-operator-scripts\") pod \"nova-cell0-00fb-account-create-update-4594l\" (UID: \"bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0\") " pod="openstack/nova-cell0-00fb-account-create-update-4594l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.402874 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8m6d\" (UniqueName: \"kubernetes.io/projected/bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0-kube-api-access-q8m6d\") pod \"nova-cell0-00fb-account-create-update-4594l\" (UID: \"bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0\") " pod="openstack/nova-cell0-00fb-account-create-update-4594l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.403743 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7623a19-7720-48a2-9a09-7c1d9d1acf3a-operator-scripts\") pod \"nova-cell1-db-create-5862l\" (UID: \"c7623a19-7720-48a2-9a09-7c1d9d1acf3a\") " pod="openstack/nova-cell1-db-create-5862l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.411051 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d9dc-account-create-update-p9wrt" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.432862 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvr7l\" (UniqueName: \"kubernetes.io/projected/c7623a19-7720-48a2-9a09-7c1d9d1acf3a-kube-api-access-mvr7l\") pod \"nova-cell1-db-create-5862l\" (UID: \"c7623a19-7720-48a2-9a09-7c1d9d1acf3a\") " pod="openstack/nova-cell1-db-create-5862l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.451784 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-cqfhq" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.451952 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-83cd-account-create-update-9h25q"] Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.455800 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-83cd-account-create-update-9h25q" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.458533 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.467460 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-83cd-account-create-update-9h25q"] Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.505388 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0-operator-scripts\") pod \"nova-cell0-00fb-account-create-update-4594l\" (UID: \"bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0\") " pod="openstack/nova-cell0-00fb-account-create-update-4594l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.505441 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8m6d\" (UniqueName: \"kubernetes.io/projected/bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0-kube-api-access-q8m6d\") pod \"nova-cell0-00fb-account-create-update-4594l\" (UID: \"bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0\") " pod="openstack/nova-cell0-00fb-account-create-update-4594l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.506665 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0-operator-scripts\") pod \"nova-cell0-00fb-account-create-update-4594l\" (UID: \"bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0\") " 
pod="openstack/nova-cell0-00fb-account-create-update-4594l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.550722 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8m6d\" (UniqueName: \"kubernetes.io/projected/bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0-kube-api-access-q8m6d\") pod \"nova-cell0-00fb-account-create-update-4594l\" (UID: \"bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0\") " pod="openstack/nova-cell0-00fb-account-create-update-4594l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.596208 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.607959 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzz7z\" (UniqueName: \"kubernetes.io/projected/79bae2a9-56d6-4292-b84b-c346934e5e08-kube-api-access-tzz7z\") pod \"nova-cell1-83cd-account-create-update-9h25q\" (UID: \"79bae2a9-56d6-4292-b84b-c346934e5e08\") " pod="openstack/nova-cell1-83cd-account-create-update-9h25q" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.608177 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79bae2a9-56d6-4292-b84b-c346934e5e08-operator-scripts\") pod \"nova-cell1-83cd-account-create-update-9h25q\" (UID: \"79bae2a9-56d6-4292-b84b-c346934e5e08\") " pod="openstack/nova-cell1-83cd-account-create-update-9h25q" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.609928 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5862l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.620019 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-00fb-account-create-update-4594l" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.709398 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79bae2a9-56d6-4292-b84b-c346934e5e08-operator-scripts\") pod \"nova-cell1-83cd-account-create-update-9h25q\" (UID: \"79bae2a9-56d6-4292-b84b-c346934e5e08\") " pod="openstack/nova-cell1-83cd-account-create-update-9h25q" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.709513 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzz7z\" (UniqueName: \"kubernetes.io/projected/79bae2a9-56d6-4292-b84b-c346934e5e08-kube-api-access-tzz7z\") pod \"nova-cell1-83cd-account-create-update-9h25q\" (UID: \"79bae2a9-56d6-4292-b84b-c346934e5e08\") " pod="openstack/nova-cell1-83cd-account-create-update-9h25q" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.710364 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79bae2a9-56d6-4292-b84b-c346934e5e08-operator-scripts\") pod \"nova-cell1-83cd-account-create-update-9h25q\" (UID: \"79bae2a9-56d6-4292-b84b-c346934e5e08\") " pod="openstack/nova-cell1-83cd-account-create-update-9h25q" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.737656 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzz7z\" (UniqueName: \"kubernetes.io/projected/79bae2a9-56d6-4292-b84b-c346934e5e08-kube-api-access-tzz7z\") pod \"nova-cell1-83cd-account-create-update-9h25q\" (UID: \"79bae2a9-56d6-4292-b84b-c346934e5e08\") " pod="openstack/nova-cell1-83cd-account-create-update-9h25q" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.815158 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-83cd-account-create-update-9h25q" Feb 19 10:03:40 crc kubenswrapper[4873]: I0219 10:03:40.843755 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-87df9b646-2jf26" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Feb 19 10:03:42 crc kubenswrapper[4873]: I0219 10:03:42.003966 4873 generic.go:334] "Generic (PLEG): container finished" podID="3e2e96b4-be71-4257-a1ed-0c7427ed0e64" containerID="c2fc7030796f36afab7f9dbbf523310f26e22db5ef1e487e914036c0ac971b2a" exitCode=0 Feb 19 10:03:42 crc kubenswrapper[4873]: I0219 10:03:42.004011 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-749b6895f6-pmvtl" event={"ID":"3e2e96b4-be71-4257-a1ed-0c7427ed0e64","Type":"ContainerDied","Data":"c2fc7030796f36afab7f9dbbf523310f26e22db5ef1e487e914036c0ac971b2a"} Feb 19 10:03:42 crc kubenswrapper[4873]: I0219 10:03:42.484833 4873 scope.go:117] "RemoveContainer" containerID="cc54b787c703c958a190db022b86cf50c377c895c0e9b21773e78b4356509d96" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.551201 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.604858 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/437f5e56-e7c4-4280-9f75-2cf9e2496375-log-httpd\") pod \"437f5e56-e7c4-4280-9f75-2cf9e2496375\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.605308 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxxwz\" (UniqueName: \"kubernetes.io/projected/437f5e56-e7c4-4280-9f75-2cf9e2496375-kube-api-access-sxxwz\") pod \"437f5e56-e7c4-4280-9f75-2cf9e2496375\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.605355 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-scripts\") pod \"437f5e56-e7c4-4280-9f75-2cf9e2496375\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.605386 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-sg-core-conf-yaml\") pod \"437f5e56-e7c4-4280-9f75-2cf9e2496375\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.605440 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-combined-ca-bundle\") pod \"437f5e56-e7c4-4280-9f75-2cf9e2496375\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.605475 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-config-data\") pod \"437f5e56-e7c4-4280-9f75-2cf9e2496375\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.605551 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/437f5e56-e7c4-4280-9f75-2cf9e2496375-run-httpd\") pod \"437f5e56-e7c4-4280-9f75-2cf9e2496375\" (UID: \"437f5e56-e7c4-4280-9f75-2cf9e2496375\") " Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.608352 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/437f5e56-e7c4-4280-9f75-2cf9e2496375-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "437f5e56-e7c4-4280-9f75-2cf9e2496375" (UID: "437f5e56-e7c4-4280-9f75-2cf9e2496375"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.609519 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/437f5e56-e7c4-4280-9f75-2cf9e2496375-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "437f5e56-e7c4-4280-9f75-2cf9e2496375" (UID: "437f5e56-e7c4-4280-9f75-2cf9e2496375"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.635495 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/437f5e56-e7c4-4280-9f75-2cf9e2496375-kube-api-access-sxxwz" (OuterVolumeSpecName: "kube-api-access-sxxwz") pod "437f5e56-e7c4-4280-9f75-2cf9e2496375" (UID: "437f5e56-e7c4-4280-9f75-2cf9e2496375"). InnerVolumeSpecName "kube-api-access-sxxwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.649356 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-scripts" (OuterVolumeSpecName: "scripts") pod "437f5e56-e7c4-4280-9f75-2cf9e2496375" (UID: "437f5e56-e7c4-4280-9f75-2cf9e2496375"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.710287 4873 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/437f5e56-e7c4-4280-9f75-2cf9e2496375-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.710324 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxxwz\" (UniqueName: \"kubernetes.io/projected/437f5e56-e7c4-4280-9f75-2cf9e2496375-kube-api-access-sxxwz\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.710335 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.710343 4873 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/437f5e56-e7c4-4280-9f75-2cf9e2496375-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.714756 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "437f5e56-e7c4-4280-9f75-2cf9e2496375" (UID: "437f5e56-e7c4-4280-9f75-2cf9e2496375"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.783655 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.786624 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "437f5e56-e7c4-4280-9f75-2cf9e2496375" (UID: "437f5e56-e7c4-4280-9f75-2cf9e2496375"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.820248 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkk7t\" (UniqueName: \"kubernetes.io/projected/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-kube-api-access-zkk7t\") pod \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.820302 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-httpd-config\") pod \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.820332 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-ovndb-tls-certs\") pod \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.820394 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-config\") pod 
\"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.821010 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-combined-ca-bundle\") pod \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\" (UID: \"3e2e96b4-be71-4257-a1ed-0c7427ed0e64\") " Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.822596 4873 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.822613 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.831643 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3e2e96b4-be71-4257-a1ed-0c7427ed0e64" (UID: "3e2e96b4-be71-4257-a1ed-0c7427ed0e64"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.832174 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-kube-api-access-zkk7t" (OuterVolumeSpecName: "kube-api-access-zkk7t") pod "3e2e96b4-be71-4257-a1ed-0c7427ed0e64" (UID: "3e2e96b4-be71-4257-a1ed-0c7427ed0e64"). InnerVolumeSpecName "kube-api-access-zkk7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.909714 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e2e96b4-be71-4257-a1ed-0c7427ed0e64" (UID: "3e2e96b4-be71-4257-a1ed-0c7427ed0e64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.914227 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-config-data" (OuterVolumeSpecName: "config-data") pod "437f5e56-e7c4-4280-9f75-2cf9e2496375" (UID: "437f5e56-e7c4-4280-9f75-2cf9e2496375"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.924959 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkk7t\" (UniqueName: \"kubernetes.io/projected/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-kube-api-access-zkk7t\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.925290 4873 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.925388 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.925487 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/437f5e56-e7c4-4280-9f75-2cf9e2496375-config-data\") on node \"crc\" DevicePath 
\"\"" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.934313 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-config" (OuterVolumeSpecName: "config") pod "3e2e96b4-be71-4257-a1ed-0c7427ed0e64" (UID: "3e2e96b4-be71-4257-a1ed-0c7427ed0e64"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:43 crc kubenswrapper[4873]: I0219 10:03:43.973635 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3e2e96b4-be71-4257-a1ed-0c7427ed0e64" (UID: "3e2e96b4-be71-4257-a1ed-0c7427ed0e64"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.028998 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"ab7f7779-d6dd-4844-8af5-83ade972d9d0","Type":"ContainerStarted","Data":"747cd165cbbee3ecae96cc8c9648bca4b8f233bd477999cf09756be76185ea16"} Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.033516 4873 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.033540 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e2e96b4-be71-4257-a1ed-0c7427ed0e64-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.045472 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.045478 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"437f5e56-e7c4-4280-9f75-2cf9e2496375","Type":"ContainerDied","Data":"a72128a70548416ae211c60013a87319728fca02cd7888fa60778dec8ba63ea4"} Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.045531 4873 scope.go:117] "RemoveContainer" containerID="7e5158670e3a62976761b268e07eaed28eb3274621570c9d17f2e4ba96a59299" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.048692 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5c4eb2b5-d272-49ff-938e-3e3359d29f46","Type":"ContainerStarted","Data":"cdb538c387e4bf3b5eec3475b8fd03ad3756c6e3d42dfe7494494ac40869cd29"} Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.051664 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-749b6895f6-pmvtl" event={"ID":"3e2e96b4-be71-4257-a1ed-0c7427ed0e64","Type":"ContainerDied","Data":"936d413c07d2e70cda379bc1d9e56c3d69a0e75d48e2a897c8fb38cdf7c08e5e"} Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.051747 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-749b6895f6-pmvtl" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.165428 4873 scope.go:117] "RemoveContainer" containerID="ea2c29e9b6a6ae2111af938dab80bf0fe86bf1c95d1132e043df7ce04e75e6e4" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.178189 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.7437920829999998 podStartE2EDuration="16.178170369s" podCreationTimestamp="2026-02-19 10:03:28 +0000 UTC" firstStartedPulling="2026-02-19 10:03:29.853229831 +0000 UTC m=+1119.142661469" lastFinishedPulling="2026-02-19 10:03:43.287608107 +0000 UTC m=+1132.577039755" observedRunningTime="2026-02-19 10:03:44.093822159 +0000 UTC m=+1133.383253817" watchObservedRunningTime="2026-02-19 10:03:44.178170369 +0000 UTC m=+1133.467602007" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.180143 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-749b6895f6-pmvtl"] Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.194834 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-749b6895f6-pmvtl"] Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.203289 4873 scope.go:117] "RemoveContainer" containerID="3858e9a302b9c6afbe14985a270c0ff8aac1476a89852d70300dc30e43cfb9a9" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.207209 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.229174 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.232258 4873 scope.go:117] "RemoveContainer" containerID="bde159d1e713bedb9f809d72fb731177094209d5eae3fafbca00c3bf92dd0edf" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.240243 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 
10:03:44 crc kubenswrapper[4873]: E0219 10:03:44.240771 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="ceilometer-central-agent" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.240789 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="ceilometer-central-agent" Feb 19 10:03:44 crc kubenswrapper[4873]: E0219 10:03:44.240804 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="proxy-httpd" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.240812 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="proxy-httpd" Feb 19 10:03:44 crc kubenswrapper[4873]: E0219 10:03:44.240828 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2e96b4-be71-4257-a1ed-0c7427ed0e64" containerName="neutron-httpd" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.240836 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2e96b4-be71-4257-a1ed-0c7427ed0e64" containerName="neutron-httpd" Feb 19 10:03:44 crc kubenswrapper[4873]: E0219 10:03:44.240850 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="sg-core" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.240857 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="sg-core" Feb 19 10:03:44 crc kubenswrapper[4873]: E0219 10:03:44.240888 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2e96b4-be71-4257-a1ed-0c7427ed0e64" containerName="neutron-api" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.240895 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2e96b4-be71-4257-a1ed-0c7427ed0e64" containerName="neutron-api" Feb 19 10:03:44 crc 
kubenswrapper[4873]: E0219 10:03:44.240909 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="ceilometer-notification-agent" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.240917 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="ceilometer-notification-agent" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.241135 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2e96b4-be71-4257-a1ed-0c7427ed0e64" containerName="neutron-httpd" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.241153 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="sg-core" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.241163 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="proxy-httpd" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.241174 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2e96b4-be71-4257-a1ed-0c7427ed0e64" containerName="neutron-api" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.241180 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="ceilometer-central-agent" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.241197 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" containerName="ceilometer-notification-agent" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.243993 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.251568 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.256737 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.256769 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.269018 4873 scope.go:117] "RemoveContainer" containerID="a79fc5280241e2b1290e7c529d75e24bfc2e34924a4a60f5635bd08c8d066317" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.323053 4873 scope.go:117] "RemoveContainer" containerID="c2fc7030796f36afab7f9dbbf523310f26e22db5ef1e487e914036c0ac971b2a" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.339832 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-scripts\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.339923 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.339963 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " 
pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.340077 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6nsr\" (UniqueName: \"kubernetes.io/projected/57a06649-fa26-4970-90c9-23271a1471a5-kube-api-access-g6nsr\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.340146 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-config-data\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.340233 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57a06649-fa26-4970-90c9-23271a1471a5-log-httpd\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.340283 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57a06649-fa26-4970-90c9-23271a1471a5-run-httpd\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.346799 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-cqfhq"] Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.371268 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-83cd-account-create-update-9h25q"] Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.442133 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-g6nsr\" (UniqueName: \"kubernetes.io/projected/57a06649-fa26-4970-90c9-23271a1471a5-kube-api-access-g6nsr\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.442198 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-config-data\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.442263 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57a06649-fa26-4970-90c9-23271a1471a5-log-httpd\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.442303 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57a06649-fa26-4970-90c9-23271a1471a5-run-httpd\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.442324 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-scripts\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.442360 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc 
kubenswrapper[4873]: I0219 10:03:44.442383 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.445989 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57a06649-fa26-4970-90c9-23271a1471a5-run-httpd\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.448782 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57a06649-fa26-4970-90c9-23271a1471a5-log-httpd\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.450768 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.451320 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-scripts\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.463011 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.464534 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-config-data\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.475343 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6nsr\" (UniqueName: \"kubernetes.io/projected/57a06649-fa26-4970-90c9-23271a1471a5-kube-api-access-g6nsr\") pod \"ceilometer-0\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.485642 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7c6d694569-qbpxm"] Feb 19 10:03:44 crc kubenswrapper[4873]: W0219 10:03:44.493175 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd51beb70_e455_4e75_9e06_863b41fbf9a8.slice/crio-30a439a6bb41129640b590b3b0580fc88086492e7a41d0d8fdaff456a8c21e64 WatchSource:0}: Error finding container 30a439a6bb41129640b590b3b0580fc88086492e7a41d0d8fdaff456a8c21e64: Status 404 returned error can't find the container with id 30a439a6bb41129640b590b3b0580fc88086492e7a41d0d8fdaff456a8c21e64 Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.584164 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-00fb-account-create-update-4594l"] Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.584285 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.598063 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hbt9r"] Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.649094 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d9dc-account-create-update-p9wrt"] Feb 19 10:03:44 crc kubenswrapper[4873]: I0219 10:03:44.664522 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5862l"] Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.073786 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7c6d694569-qbpxm" event={"ID":"d51beb70-e455-4e75-9e06-863b41fbf9a8","Type":"ContainerStarted","Data":"45a3b4a3ee6b7a836225d08c195c006061b0cf7e34c91e5b09b7e0fd8b04caad"} Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.074180 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7c6d694569-qbpxm" event={"ID":"d51beb70-e455-4e75-9e06-863b41fbf9a8","Type":"ContainerStarted","Data":"30a439a6bb41129640b590b3b0580fc88086492e7a41d0d8fdaff456a8c21e64"} Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.078967 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cqfhq" event={"ID":"3c26aa2d-a8f4-4645-a1b6-055cb88e64d6","Type":"ContainerStarted","Data":"bd733b3ac9bfee3b4bbee18624c05c5bec301e1fa09008f9d0b7376ff957c31a"} Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.079016 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cqfhq" event={"ID":"3c26aa2d-a8f4-4645-a1b6-055cb88e64d6","Type":"ContainerStarted","Data":"22f8a0eea83050aa51d42973c19c5beb0e4d9dd72c9f715f981a45eaa0d280da"} Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.081146 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hbt9r" 
event={"ID":"b1d06337-fba1-4b9c-abbc-02f635fd3bdd","Type":"ContainerStarted","Data":"e8d94d086cbc8186ccd052540733a5a349014300ab2753d49e5a2c1a63f70e41"} Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.083795 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d9dc-account-create-update-p9wrt" event={"ID":"3129fa03-2686-49af-a434-341b19fb6661","Type":"ContainerStarted","Data":"3eb25a41aa1ebe3aef914232f8db296d6511713d93dc4071313bddb79d55fb80"} Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.085060 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-83cd-account-create-update-9h25q" event={"ID":"79bae2a9-56d6-4292-b84b-c346934e5e08","Type":"ContainerStarted","Data":"4b38c8bb0d66f0eab5c9674afb8862e41ff03592dbf71798d72618e652e32219"} Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.085091 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-83cd-account-create-update-9h25q" event={"ID":"79bae2a9-56d6-4292-b84b-c346934e5e08","Type":"ContainerStarted","Data":"65a7fa1ee7b5178acce98eedba35f101b183e27405952b28bd1e62beeb042654"} Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.088931 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5862l" event={"ID":"c7623a19-7720-48a2-9a09-7c1d9d1acf3a","Type":"ContainerStarted","Data":"429f383cce180b32b32f77705894926ca78081e4e70b072956064f0bd6f1a12e"} Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.090961 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-00fb-account-create-update-4594l" event={"ID":"bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0","Type":"ContainerStarted","Data":"2c40ec07bf5b6c41e9dacb848da00df9379b151069b0e2d66aa2102a14ad638c"} Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.101701 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-cqfhq" podStartSLOduration=6.101680914 
podStartE2EDuration="6.101680914s" podCreationTimestamp="2026-02-19 10:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:45.096019103 +0000 UTC m=+1134.385450741" watchObservedRunningTime="2026-02-19 10:03:45.101680914 +0000 UTC m=+1134.391112552" Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.117381 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-83cd-account-create-update-9h25q" podStartSLOduration=5.117358566 podStartE2EDuration="5.117358566s" podCreationTimestamp="2026-02-19 10:03:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:45.113360446 +0000 UTC m=+1134.402792084" watchObservedRunningTime="2026-02-19 10:03:45.117358566 +0000 UTC m=+1134.406790204" Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.289823 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.501021 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e2e96b4-be71-4257-a1ed-0c7427ed0e64" path="/var/lib/kubelet/pods/3e2e96b4-be71-4257-a1ed-0c7427ed0e64/volumes" Feb 19 10:03:45 crc kubenswrapper[4873]: I0219 10:03:45.501674 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="437f5e56-e7c4-4280-9f75-2cf9e2496375" path="/var/lib/kubelet/pods/437f5e56-e7c4-4280-9f75-2cf9e2496375/volumes" Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.106211 4873 generic.go:334] "Generic (PLEG): container finished" podID="3129fa03-2686-49af-a434-341b19fb6661" containerID="e32aae1cb5da5f588b5186b7220b1239b5386c9e999d9330ceeb577323a9711c" exitCode=0 Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.106685 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-d9dc-account-create-update-p9wrt" event={"ID":"3129fa03-2686-49af-a434-341b19fb6661","Type":"ContainerDied","Data":"e32aae1cb5da5f588b5186b7220b1239b5386c9e999d9330ceeb577323a9711c"} Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.123922 4873 generic.go:334] "Generic (PLEG): container finished" podID="bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0" containerID="9a04cebf97180c8ea6d0724c6fe0c31aa2fbc8062f300b3608d26c13788862d9" exitCode=0 Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.124027 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-00fb-account-create-update-4594l" event={"ID":"bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0","Type":"ContainerDied","Data":"9a04cebf97180c8ea6d0724c6fe0c31aa2fbc8062f300b3608d26c13788862d9"} Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.129861 4873 generic.go:334] "Generic (PLEG): container finished" podID="79bae2a9-56d6-4292-b84b-c346934e5e08" containerID="4b38c8bb0d66f0eab5c9674afb8862e41ff03592dbf71798d72618e652e32219" exitCode=0 Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.129919 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-83cd-account-create-update-9h25q" event={"ID":"79bae2a9-56d6-4292-b84b-c346934e5e08","Type":"ContainerDied","Data":"4b38c8bb0d66f0eab5c9674afb8862e41ff03592dbf71798d72618e652e32219"} Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.133286 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7c6d694569-qbpxm" event={"ID":"d51beb70-e455-4e75-9e06-863b41fbf9a8","Type":"ContainerStarted","Data":"7c8ef61c6a061d0ef8448c266c728f9f150e0c74e880a0ed8a4a67e766db6b55"} Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.133882 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.134375 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.137574 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57a06649-fa26-4970-90c9-23271a1471a5","Type":"ContainerStarted","Data":"b9591c30a9a01ef428ff823d1f169650f4d7ce1f33aa25a91037d30f706c516b"} Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.137619 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57a06649-fa26-4970-90c9-23271a1471a5","Type":"ContainerStarted","Data":"6bdc124061692b5087d0caebc56c02311827bc3d5ab68485c02dae362189e383"} Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.156848 4873 generic.go:334] "Generic (PLEG): container finished" podID="3c26aa2d-a8f4-4645-a1b6-055cb88e64d6" containerID="bd733b3ac9bfee3b4bbee18624c05c5bec301e1fa09008f9d0b7376ff957c31a" exitCode=0 Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.156952 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cqfhq" event={"ID":"3c26aa2d-a8f4-4645-a1b6-055cb88e64d6","Type":"ContainerDied","Data":"bd733b3ac9bfee3b4bbee18624c05c5bec301e1fa09008f9d0b7376ff957c31a"} Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.159032 4873 generic.go:334] "Generic (PLEG): container finished" podID="b1d06337-fba1-4b9c-abbc-02f635fd3bdd" containerID="dd8d0b4c8e6c8fa16639b3273dca3bab2c82aa1c797c85d4fed1f4b2808775ab" exitCode=0 Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.159223 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hbt9r" event={"ID":"b1d06337-fba1-4b9c-abbc-02f635fd3bdd","Type":"ContainerDied","Data":"dd8d0b4c8e6c8fa16639b3273dca3bab2c82aa1c797c85d4fed1f4b2808775ab"} Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.177878 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7c6d694569-qbpxm" podStartSLOduration=10.177861607 
podStartE2EDuration="10.177861607s" podCreationTimestamp="2026-02-19 10:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:46.161349995 +0000 UTC m=+1135.450781633" watchObservedRunningTime="2026-02-19 10:03:46.177861607 +0000 UTC m=+1135.467293245" Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.179623 4873 generic.go:334] "Generic (PLEG): container finished" podID="c7623a19-7720-48a2-9a09-7c1d9d1acf3a" containerID="4f7932193028af20a89fc4d6ec905cbeaeae8f2a0c2eccdd691dcdae0d83a150" exitCode=0 Feb 19 10:03:46 crc kubenswrapper[4873]: I0219 10:03:46.179691 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5862l" event={"ID":"c7623a19-7720-48a2-9a09-7c1d9d1acf3a","Type":"ContainerDied","Data":"4f7932193028af20a89fc4d6ec905cbeaeae8f2a0c2eccdd691dcdae0d83a150"} Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.135043 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.171517 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.193274 4873 generic.go:334] "Generic (PLEG): container finished" podID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerID="1a1b6f4ba694daddb17f029a0bbce06c79e8294e69f096dade9d91ac98c03f81" exitCode=137 Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.193392 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87df9b646-2jf26" event={"ID":"cace1157-1459-4823-aa8f-b2c246d3adeb","Type":"ContainerDied","Data":"1a1b6f4ba694daddb17f029a0bbce06c79e8294e69f096dade9d91ac98c03f81"} Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.196317 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"57a06649-fa26-4970-90c9-23271a1471a5","Type":"ContainerStarted","Data":"285d8f5f3df00508865f8d3bde2dfe35df29620af39cdf5820cccfea23f27068"} Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.196633 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.246648 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.329971 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.704651 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-83cd-account-create-update-9h25q" Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.867069 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79bae2a9-56d6-4292-b84b-c346934e5e08-operator-scripts\") pod \"79bae2a9-56d6-4292-b84b-c346934e5e08\" (UID: \"79bae2a9-56d6-4292-b84b-c346934e5e08\") " Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.867209 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzz7z\" (UniqueName: \"kubernetes.io/projected/79bae2a9-56d6-4292-b84b-c346934e5e08-kube-api-access-tzz7z\") pod \"79bae2a9-56d6-4292-b84b-c346934e5e08\" (UID: \"79bae2a9-56d6-4292-b84b-c346934e5e08\") " Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.868909 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79bae2a9-56d6-4292-b84b-c346934e5e08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "79bae2a9-56d6-4292-b84b-c346934e5e08" (UID: "79bae2a9-56d6-4292-b84b-c346934e5e08"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.886587 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79bae2a9-56d6-4292-b84b-c346934e5e08-kube-api-access-tzz7z" (OuterVolumeSpecName: "kube-api-access-tzz7z") pod "79bae2a9-56d6-4292-b84b-c346934e5e08" (UID: "79bae2a9-56d6-4292-b84b-c346934e5e08"). InnerVolumeSpecName "kube-api-access-tzz7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.970807 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79bae2a9-56d6-4292-b84b-c346934e5e08-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:47 crc kubenswrapper[4873]: I0219 10:03:47.970838 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzz7z\" (UniqueName: \"kubernetes.io/projected/79bae2a9-56d6-4292-b84b-c346934e5e08-kube-api-access-tzz7z\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.166535 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d9dc-account-create-update-p9wrt" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.189504 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cqfhq" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.192451 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5862l" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.211654 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-00fb-account-create-update-4594l" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.218508 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5862l" event={"ID":"c7623a19-7720-48a2-9a09-7c1d9d1acf3a","Type":"ContainerDied","Data":"429f383cce180b32b32f77705894926ca78081e4e70b072956064f0bd6f1a12e"} Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.218546 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="429f383cce180b32b32f77705894926ca78081e4e70b072956064f0bd6f1a12e" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.218612 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5862l" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.233527 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d9dc-account-create-update-p9wrt" event={"ID":"3129fa03-2686-49af-a434-341b19fb6661","Type":"ContainerDied","Data":"3eb25a41aa1ebe3aef914232f8db296d6511713d93dc4071313bddb79d55fb80"} Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.233559 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eb25a41aa1ebe3aef914232f8db296d6511713d93dc4071313bddb79d55fb80" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.233659 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d9dc-account-create-update-p9wrt" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.240477 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.240531 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.245422 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hbt9r" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.246746 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87df9b646-2jf26" event={"ID":"cace1157-1459-4823-aa8f-b2c246d3adeb","Type":"ContainerDied","Data":"9b51de3389b17ee12d7f59af44e9eef14565d045f0ca62918a72e7a072d8c72e"} Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.246780 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b51de3389b17ee12d7f59af44e9eef14565d045f0ca62918a72e7a072d8c72e" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.247758 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.248181 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hbt9r" event={"ID":"b1d06337-fba1-4b9c-abbc-02f635fd3bdd","Type":"ContainerDied","Data":"e8d94d086cbc8186ccd052540733a5a349014300ab2753d49e5a2c1a63f70e41"} Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.248197 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8d94d086cbc8186ccd052540733a5a349014300ab2753d49e5a2c1a63f70e41" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.248236 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hbt9r" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.254797 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57a06649-fa26-4970-90c9-23271a1471a5","Type":"ContainerStarted","Data":"1c794715d03a80d14cb4c1609cba1fa9ac955e06ef46f83a91a22fdc84e32a39"} Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.259917 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cqfhq" event={"ID":"3c26aa2d-a8f4-4645-a1b6-055cb88e64d6","Type":"ContainerDied","Data":"22f8a0eea83050aa51d42973c19c5beb0e4d9dd72c9f715f981a45eaa0d280da"} Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.259945 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22f8a0eea83050aa51d42973c19c5beb0e4d9dd72c9f715f981a45eaa0d280da" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.259996 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-cqfhq" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.262961 4873 generic.go:334] "Generic (PLEG): container finished" podID="8786cefd-adc3-4acf-bc04-066bc0510131" containerID="d65db798423e51c2a9f8d6a3012c9ccc857964209249f5fc3748a28883833968" exitCode=137 Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.263015 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8786cefd-adc3-4acf-bc04-066bc0510131","Type":"ContainerDied","Data":"d65db798423e51c2a9f8d6a3012c9ccc857964209249f5fc3748a28883833968"} Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.268484 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-00fb-account-create-update-4594l" event={"ID":"bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0","Type":"ContainerDied","Data":"2c40ec07bf5b6c41e9dacb848da00df9379b151069b0e2d66aa2102a14ad638c"} Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.268510 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-00fb-account-create-update-4594l" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.268529 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c40ec07bf5b6c41e9dacb848da00df9379b151069b0e2d66aa2102a14ad638c" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.273539 4873 generic.go:334] "Generic (PLEG): container finished" podID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerID="747cd165cbbee3ecae96cc8c9648bca4b8f233bd477999cf09756be76185ea16" exitCode=1 Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.273779 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"ab7f7779-d6dd-4844-8af5-83ade972d9d0","Type":"ContainerDied","Data":"747cd165cbbee3ecae96cc8c9648bca4b8f233bd477999cf09756be76185ea16"} Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.273847 4873 scope.go:117] "RemoveContainer" containerID="cc54b787c703c958a190db022b86cf50c377c895c0e9b21773e78b4356509d96" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.273935 4873 scope.go:117] "RemoveContainer" containerID="747cd165cbbee3ecae96cc8c9648bca4b8f233bd477999cf09756be76185ea16" Feb 19 10:03:48 crc kubenswrapper[4873]: E0219 10:03:48.274371 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(ab7f7779-d6dd-4844-8af5-83ade972d9d0)\"" pod="openstack/watcher-decision-engine-0" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.275708 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-83cd-account-create-update-9h25q" event={"ID":"79bae2a9-56d6-4292-b84b-c346934e5e08","Type":"ContainerDied","Data":"65a7fa1ee7b5178acce98eedba35f101b183e27405952b28bd1e62beeb042654"} 
Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.275743 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65a7fa1ee7b5178acce98eedba35f101b183e27405952b28bd1e62beeb042654" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.275721 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-83cd-account-create-update-9h25q" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.278906 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8m6d\" (UniqueName: \"kubernetes.io/projected/bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0-kube-api-access-q8m6d\") pod \"bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0\" (UID: \"bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.278979 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7623a19-7720-48a2-9a09-7c1d9d1acf3a-operator-scripts\") pod \"c7623a19-7720-48a2-9a09-7c1d9d1acf3a\" (UID: \"c7623a19-7720-48a2-9a09-7c1d9d1acf3a\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.279856 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7623a19-7720-48a2-9a09-7c1d9d1acf3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7623a19-7720-48a2-9a09-7c1d9d1acf3a" (UID: "c7623a19-7720-48a2-9a09-7c1d9d1acf3a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.280793 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3129fa03-2686-49af-a434-341b19fb6661-operator-scripts\") pod \"3129fa03-2686-49af-a434-341b19fb6661\" (UID: \"3129fa03-2686-49af-a434-341b19fb6661\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.280821 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c26aa2d-a8f4-4645-a1b6-055cb88e64d6-operator-scripts\") pod \"3c26aa2d-a8f4-4645-a1b6-055cb88e64d6\" (UID: \"3c26aa2d-a8f4-4645-a1b6-055cb88e64d6\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.280843 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r42c\" (UniqueName: \"kubernetes.io/projected/3129fa03-2686-49af-a434-341b19fb6661-kube-api-access-8r42c\") pod \"3129fa03-2686-49af-a434-341b19fb6661\" (UID: \"3129fa03-2686-49af-a434-341b19fb6661\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.280892 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0-operator-scripts\") pod \"bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0\" (UID: \"bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.281034 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvr7l\" (UniqueName: \"kubernetes.io/projected/c7623a19-7720-48a2-9a09-7c1d9d1acf3a-kube-api-access-mvr7l\") pod \"c7623a19-7720-48a2-9a09-7c1d9d1acf3a\" (UID: \"c7623a19-7720-48a2-9a09-7c1d9d1acf3a\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.281244 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-wlp77\" (UniqueName: \"kubernetes.io/projected/3c26aa2d-a8f4-4645-a1b6-055cb88e64d6-kube-api-access-wlp77\") pod \"3c26aa2d-a8f4-4645-a1b6-055cb88e64d6\" (UID: \"3c26aa2d-a8f4-4645-a1b6-055cb88e64d6\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.281861 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3129fa03-2686-49af-a434-341b19fb6661-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3129fa03-2686-49af-a434-341b19fb6661" (UID: "3129fa03-2686-49af-a434-341b19fb6661"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.283269 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0" (UID: "bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.284371 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7623a19-7720-48a2-9a09-7c1d9d1acf3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.284391 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3129fa03-2686-49af-a434-341b19fb6661-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.284403 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.292610 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c26aa2d-a8f4-4645-a1b6-055cb88e64d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c26aa2d-a8f4-4645-a1b6-055cb88e64d6" (UID: "3c26aa2d-a8f4-4645-a1b6-055cb88e64d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.300248 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7623a19-7720-48a2-9a09-7c1d9d1acf3a-kube-api-access-mvr7l" (OuterVolumeSpecName: "kube-api-access-mvr7l") pod "c7623a19-7720-48a2-9a09-7c1d9d1acf3a" (UID: "c7623a19-7720-48a2-9a09-7c1d9d1acf3a"). InnerVolumeSpecName "kube-api-access-mvr7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.305447 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3129fa03-2686-49af-a434-341b19fb6661-kube-api-access-8r42c" (OuterVolumeSpecName: "kube-api-access-8r42c") pod "3129fa03-2686-49af-a434-341b19fb6661" (UID: "3129fa03-2686-49af-a434-341b19fb6661"). InnerVolumeSpecName "kube-api-access-8r42c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.305688 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c26aa2d-a8f4-4645-a1b6-055cb88e64d6-kube-api-access-wlp77" (OuterVolumeSpecName: "kube-api-access-wlp77") pod "3c26aa2d-a8f4-4645-a1b6-055cb88e64d6" (UID: "3c26aa2d-a8f4-4645-a1b6-055cb88e64d6"). InnerVolumeSpecName "kube-api-access-wlp77". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.330668 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0-kube-api-access-q8m6d" (OuterVolumeSpecName: "kube-api-access-q8m6d") pod "bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0" (UID: "bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0"). InnerVolumeSpecName "kube-api-access-q8m6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.386009 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-combined-ca-bundle\") pod \"cace1157-1459-4823-aa8f-b2c246d3adeb\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.386158 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-horizon-tls-certs\") pod \"cace1157-1459-4823-aa8f-b2c246d3adeb\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.386189 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gctfv\" (UniqueName: \"kubernetes.io/projected/b1d06337-fba1-4b9c-abbc-02f635fd3bdd-kube-api-access-gctfv\") pod \"b1d06337-fba1-4b9c-abbc-02f635fd3bdd\" (UID: \"b1d06337-fba1-4b9c-abbc-02f635fd3bdd\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.386251 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cace1157-1459-4823-aa8f-b2c246d3adeb-scripts\") pod \"cace1157-1459-4823-aa8f-b2c246d3adeb\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.386313 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d06337-fba1-4b9c-abbc-02f635fd3bdd-operator-scripts\") pod \"b1d06337-fba1-4b9c-abbc-02f635fd3bdd\" (UID: \"b1d06337-fba1-4b9c-abbc-02f635fd3bdd\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.386350 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jt75m\" (UniqueName: \"kubernetes.io/projected/cace1157-1459-4823-aa8f-b2c246d3adeb-kube-api-access-jt75m\") pod \"cace1157-1459-4823-aa8f-b2c246d3adeb\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.386408 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cace1157-1459-4823-aa8f-b2c246d3adeb-config-data\") pod \"cace1157-1459-4823-aa8f-b2c246d3adeb\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.386453 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cace1157-1459-4823-aa8f-b2c246d3adeb-logs\") pod \"cace1157-1459-4823-aa8f-b2c246d3adeb\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.386579 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-horizon-secret-key\") pod \"cace1157-1459-4823-aa8f-b2c246d3adeb\" (UID: \"cace1157-1459-4823-aa8f-b2c246d3adeb\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.386820 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d06337-fba1-4b9c-abbc-02f635fd3bdd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1d06337-fba1-4b9c-abbc-02f635fd3bdd" (UID: "b1d06337-fba1-4b9c-abbc-02f635fd3bdd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.387172 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvr7l\" (UniqueName: \"kubernetes.io/projected/c7623a19-7720-48a2-9a09-7c1d9d1acf3a-kube-api-access-mvr7l\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.387197 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlp77\" (UniqueName: \"kubernetes.io/projected/3c26aa2d-a8f4-4645-a1b6-055cb88e64d6-kube-api-access-wlp77\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.387209 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8m6d\" (UniqueName: \"kubernetes.io/projected/bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0-kube-api-access-q8m6d\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.387223 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d06337-fba1-4b9c-abbc-02f635fd3bdd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.387234 4873 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c26aa2d-a8f4-4645-a1b6-055cb88e64d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.387245 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r42c\" (UniqueName: \"kubernetes.io/projected/3129fa03-2686-49af-a434-341b19fb6661-kube-api-access-8r42c\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.388158 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cace1157-1459-4823-aa8f-b2c246d3adeb-logs" (OuterVolumeSpecName: "logs") pod 
"cace1157-1459-4823-aa8f-b2c246d3adeb" (UID: "cace1157-1459-4823-aa8f-b2c246d3adeb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.391970 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cace1157-1459-4823-aa8f-b2c246d3adeb-kube-api-access-jt75m" (OuterVolumeSpecName: "kube-api-access-jt75m") pod "cace1157-1459-4823-aa8f-b2c246d3adeb" (UID: "cace1157-1459-4823-aa8f-b2c246d3adeb"). InnerVolumeSpecName "kube-api-access-jt75m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.392526 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "cace1157-1459-4823-aa8f-b2c246d3adeb" (UID: "cace1157-1459-4823-aa8f-b2c246d3adeb"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.402729 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d06337-fba1-4b9c-abbc-02f635fd3bdd-kube-api-access-gctfv" (OuterVolumeSpecName: "kube-api-access-gctfv") pod "b1d06337-fba1-4b9c-abbc-02f635fd3bdd" (UID: "b1d06337-fba1-4b9c-abbc-02f635fd3bdd"). InnerVolumeSpecName "kube-api-access-gctfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.424231 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.436477 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cace1157-1459-4823-aa8f-b2c246d3adeb-config-data" (OuterVolumeSpecName: "config-data") pod "cace1157-1459-4823-aa8f-b2c246d3adeb" (UID: "cace1157-1459-4823-aa8f-b2c246d3adeb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.448906 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cace1157-1459-4823-aa8f-b2c246d3adeb-scripts" (OuterVolumeSpecName: "scripts") pod "cace1157-1459-4823-aa8f-b2c246d3adeb" (UID: "cace1157-1459-4823-aa8f-b2c246d3adeb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.452504 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cace1157-1459-4823-aa8f-b2c246d3adeb" (UID: "cace1157-1459-4823-aa8f-b2c246d3adeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.478073 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "cace1157-1459-4823-aa8f-b2c246d3adeb" (UID: "cace1157-1459-4823-aa8f-b2c246d3adeb"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.494851 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8786cefd-adc3-4acf-bc04-066bc0510131-logs\") pod \"8786cefd-adc3-4acf-bc04-066bc0510131\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.494978 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-config-data-custom\") pod \"8786cefd-adc3-4acf-bc04-066bc0510131\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.495021 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdhrt\" (UniqueName: \"kubernetes.io/projected/8786cefd-adc3-4acf-bc04-066bc0510131-kube-api-access-vdhrt\") pod \"8786cefd-adc3-4acf-bc04-066bc0510131\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.495177 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8786cefd-adc3-4acf-bc04-066bc0510131-etc-machine-id\") pod \"8786cefd-adc3-4acf-bc04-066bc0510131\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.495200 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-scripts\") pod \"8786cefd-adc3-4acf-bc04-066bc0510131\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.495230 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-combined-ca-bundle\") pod \"8786cefd-adc3-4acf-bc04-066bc0510131\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.495262 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-config-data\") pod \"8786cefd-adc3-4acf-bc04-066bc0510131\" (UID: \"8786cefd-adc3-4acf-bc04-066bc0510131\") " Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.495730 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cace1157-1459-4823-aa8f-b2c246d3adeb-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.495746 4873 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.495760 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.495772 4873 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cace1157-1459-4823-aa8f-b2c246d3adeb-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.495783 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gctfv\" (UniqueName: \"kubernetes.io/projected/b1d06337-fba1-4b9c-abbc-02f635fd3bdd-kube-api-access-gctfv\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.495796 4873 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/cace1157-1459-4823-aa8f-b2c246d3adeb-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.495808 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt75m\" (UniqueName: \"kubernetes.io/projected/cace1157-1459-4823-aa8f-b2c246d3adeb-kube-api-access-jt75m\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.495819 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cace1157-1459-4823-aa8f-b2c246d3adeb-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.499508 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8786cefd-adc3-4acf-bc04-066bc0510131-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8786cefd-adc3-4acf-bc04-066bc0510131" (UID: "8786cefd-adc3-4acf-bc04-066bc0510131"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.500591 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-scripts" (OuterVolumeSpecName: "scripts") pod "8786cefd-adc3-4acf-bc04-066bc0510131" (UID: "8786cefd-adc3-4acf-bc04-066bc0510131"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.501327 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8786cefd-adc3-4acf-bc04-066bc0510131" (UID: "8786cefd-adc3-4acf-bc04-066bc0510131"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.501717 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8786cefd-adc3-4acf-bc04-066bc0510131-logs" (OuterVolumeSpecName: "logs") pod "8786cefd-adc3-4acf-bc04-066bc0510131" (UID: "8786cefd-adc3-4acf-bc04-066bc0510131"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.504448 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8786cefd-adc3-4acf-bc04-066bc0510131-kube-api-access-vdhrt" (OuterVolumeSpecName: "kube-api-access-vdhrt") pod "8786cefd-adc3-4acf-bc04-066bc0510131" (UID: "8786cefd-adc3-4acf-bc04-066bc0510131"). InnerVolumeSpecName "kube-api-access-vdhrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.541870 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8786cefd-adc3-4acf-bc04-066bc0510131" (UID: "8786cefd-adc3-4acf-bc04-066bc0510131"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.580607 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-config-data" (OuterVolumeSpecName: "config-data") pod "8786cefd-adc3-4acf-bc04-066bc0510131" (UID: "8786cefd-adc3-4acf-bc04-066bc0510131"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.597592 4873 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8786cefd-adc3-4acf-bc04-066bc0510131-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.597622 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.597631 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.597642 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.597651 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8786cefd-adc3-4acf-bc04-066bc0510131-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.597659 4873 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8786cefd-adc3-4acf-bc04-066bc0510131-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:48 crc kubenswrapper[4873]: I0219 10:03:48.597668 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdhrt\" (UniqueName: \"kubernetes.io/projected/8786cefd-adc3-4acf-bc04-066bc0510131-kube-api-access-vdhrt\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.295407 4873 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8786cefd-adc3-4acf-bc04-066bc0510131","Type":"ContainerDied","Data":"8e464848b3ebc3175565441b190b29e16936e4f1ed928d10cd26c6f756af71c1"} Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.295684 4873 scope.go:117] "RemoveContainer" containerID="d65db798423e51c2a9f8d6a3012c9ccc857964209249f5fc3748a28883833968" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.295825 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.306158 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-87df9b646-2jf26" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.330295 4873 scope.go:117] "RemoveContainer" containerID="f05593f5088b36a20866dde0c189f6365ca5bb5d444303ca92f1e7f75e70ca2f" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.368665 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-87df9b646-2jf26"] Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.382540 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-87df9b646-2jf26"] Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.392796 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.402921 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.425743 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 10:03:49 crc kubenswrapper[4873]: E0219 10:03:49.426183 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7623a19-7720-48a2-9a09-7c1d9d1acf3a" containerName="mariadb-database-create" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426207 4873 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="c7623a19-7720-48a2-9a09-7c1d9d1acf3a" containerName="mariadb-database-create" Feb 19 10:03:49 crc kubenswrapper[4873]: E0219 10:03:49.426228 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3129fa03-2686-49af-a434-341b19fb6661" containerName="mariadb-account-create-update" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426238 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="3129fa03-2686-49af-a434-341b19fb6661" containerName="mariadb-account-create-update" Feb 19 10:03:49 crc kubenswrapper[4873]: E0219 10:03:49.426248 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerName="horizon" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426257 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerName="horizon" Feb 19 10:03:49 crc kubenswrapper[4873]: E0219 10:03:49.426272 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c26aa2d-a8f4-4645-a1b6-055cb88e64d6" containerName="mariadb-database-create" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426280 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c26aa2d-a8f4-4645-a1b6-055cb88e64d6" containerName="mariadb-database-create" Feb 19 10:03:49 crc kubenswrapper[4873]: E0219 10:03:49.426297 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bae2a9-56d6-4292-b84b-c346934e5e08" containerName="mariadb-account-create-update" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426304 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bae2a9-56d6-4292-b84b-c346934e5e08" containerName="mariadb-account-create-update" Feb 19 10:03:49 crc kubenswrapper[4873]: E0219 10:03:49.426321 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerName="horizon-log" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 
10:03:49.426328 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerName="horizon-log" Feb 19 10:03:49 crc kubenswrapper[4873]: E0219 10:03:49.426338 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0" containerName="mariadb-account-create-update" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426346 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0" containerName="mariadb-account-create-update" Feb 19 10:03:49 crc kubenswrapper[4873]: E0219 10:03:49.426363 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d06337-fba1-4b9c-abbc-02f635fd3bdd" containerName="mariadb-database-create" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426371 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d06337-fba1-4b9c-abbc-02f635fd3bdd" containerName="mariadb-database-create" Feb 19 10:03:49 crc kubenswrapper[4873]: E0219 10:03:49.426388 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8786cefd-adc3-4acf-bc04-066bc0510131" containerName="cinder-api" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426396 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8786cefd-adc3-4acf-bc04-066bc0510131" containerName="cinder-api" Feb 19 10:03:49 crc kubenswrapper[4873]: E0219 10:03:49.426407 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8786cefd-adc3-4acf-bc04-066bc0510131" containerName="cinder-api-log" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426414 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8786cefd-adc3-4acf-bc04-066bc0510131" containerName="cinder-api-log" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426627 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="3129fa03-2686-49af-a434-341b19fb6661" containerName="mariadb-account-create-update" Feb 19 10:03:49 crc 
kubenswrapper[4873]: I0219 10:03:49.426643 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="79bae2a9-56d6-4292-b84b-c346934e5e08" containerName="mariadb-account-create-update" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426666 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="8786cefd-adc3-4acf-bc04-066bc0510131" containerName="cinder-api" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426674 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d06337-fba1-4b9c-abbc-02f635fd3bdd" containerName="mariadb-database-create" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426683 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0" containerName="mariadb-account-create-update" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426695 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7623a19-7720-48a2-9a09-7c1d9d1acf3a" containerName="mariadb-database-create" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426703 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerName="horizon" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426712 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="8786cefd-adc3-4acf-bc04-066bc0510131" containerName="cinder-api-log" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426721 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" containerName="horizon-log" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.426731 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c26aa2d-a8f4-4645-a1b6-055cb88e64d6" containerName="mariadb-database-create" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.427844 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.430437 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.430638 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.430818 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.434311 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.498459 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8786cefd-adc3-4acf-bc04-066bc0510131" path="/var/lib/kubelet/pods/8786cefd-adc3-4acf-bc04-066bc0510131/volumes" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.499121 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cace1157-1459-4823-aa8f-b2c246d3adeb" path="/var/lib/kubelet/pods/cace1157-1459-4823-aa8f-b2c246d3adeb/volumes" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.517996 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-config-data\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.518441 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3dabe51-c676-42bb-936a-d784ee2e565a-logs\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.518476 4873 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.518522 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfrf5\" (UniqueName: \"kubernetes.io/projected/f3dabe51-c676-42bb-936a-d784ee2e565a-kube-api-access-lfrf5\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.518575 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-scripts\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.518662 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.518761 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f3dabe51-c676-42bb-936a-d784ee2e565a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.518788 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.518835 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.629561 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-scripts\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.629639 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.629704 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f3dabe51-c676-42bb-936a-d784ee2e565a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.629722 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " 
pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.629755 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.629786 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-config-data\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.629838 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3dabe51-c676-42bb-936a-d784ee2e565a-logs\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.629854 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.629901 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfrf5\" (UniqueName: \"kubernetes.io/projected/f3dabe51-c676-42bb-936a-d784ee2e565a-kube-api-access-lfrf5\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.632185 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/f3dabe51-c676-42bb-936a-d784ee2e565a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.632438 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3dabe51-c676-42bb-936a-d784ee2e565a-logs\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.634127 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.634481 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.635319 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.635420 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.636090 4873 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-config-data\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.636390 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3dabe51-c676-42bb-936a-d784ee2e565a-scripts\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.648155 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfrf5\" (UniqueName: \"kubernetes.io/projected/f3dabe51-c676-42bb-936a-d784ee2e565a-kube-api-access-lfrf5\") pod \"cinder-api-0\" (UID: \"f3dabe51-c676-42bb-936a-d784ee2e565a\") " pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.724964 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.746483 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.836880 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-combined-ca-bundle\") pod \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.836929 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-config-data\") pod \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.836976 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab7f7779-d6dd-4844-8af5-83ade972d9d0-logs\") pod \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.836994 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-custom-prometheus-ca\") pod \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.837068 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mzrl\" (UniqueName: \"kubernetes.io/projected/ab7f7779-d6dd-4844-8af5-83ade972d9d0-kube-api-access-6mzrl\") pod \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\" (UID: \"ab7f7779-d6dd-4844-8af5-83ade972d9d0\") " Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.838319 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ab7f7779-d6dd-4844-8af5-83ade972d9d0-logs" (OuterVolumeSpecName: "logs") pod "ab7f7779-d6dd-4844-8af5-83ade972d9d0" (UID: "ab7f7779-d6dd-4844-8af5-83ade972d9d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.838984 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab7f7779-d6dd-4844-8af5-83ade972d9d0-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.856337 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab7f7779-d6dd-4844-8af5-83ade972d9d0-kube-api-access-6mzrl" (OuterVolumeSpecName: "kube-api-access-6mzrl") pod "ab7f7779-d6dd-4844-8af5-83ade972d9d0" (UID: "ab7f7779-d6dd-4844-8af5-83ade972d9d0"). InnerVolumeSpecName "kube-api-access-6mzrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.872246 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab7f7779-d6dd-4844-8af5-83ade972d9d0" (UID: "ab7f7779-d6dd-4844-8af5-83ade972d9d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.888338 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "ab7f7779-d6dd-4844-8af5-83ade972d9d0" (UID: "ab7f7779-d6dd-4844-8af5-83ade972d9d0"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.914311 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-config-data" (OuterVolumeSpecName: "config-data") pod "ab7f7779-d6dd-4844-8af5-83ade972d9d0" (UID: "ab7f7779-d6dd-4844-8af5-83ade972d9d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.940963 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.941000 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.941010 4873 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ab7f7779-d6dd-4844-8af5-83ade972d9d0-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:49 crc kubenswrapper[4873]: I0219 10:03:49.941019 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mzrl\" (UniqueName: \"kubernetes.io/projected/ab7f7779-d6dd-4844-8af5-83ade972d9d0-kube-api-access-6mzrl\") on node \"crc\" DevicePath \"\"" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.196832 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.321157 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"ab7f7779-d6dd-4844-8af5-83ade972d9d0","Type":"ContainerDied","Data":"6715424b51c6df78b1881817986335974e70067799bdff519c5527858f40bf0f"} Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.321182 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.321205 4873 scope.go:117] "RemoveContainer" containerID="747cd165cbbee3ecae96cc8c9648bca4b8f233bd477999cf09756be76185ea16" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.323811 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f3dabe51-c676-42bb-936a-d784ee2e565a","Type":"ContainerStarted","Data":"b103a88a763865a4dcb1fcf97e0b3dd3c82cdbe60bd19d5f40ed2c98f6cf9e9e"} Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.430435 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.463297 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.475594 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:03:50 crc kubenswrapper[4873]: E0219 10:03:50.476046 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerName="watcher-decision-engine" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.476067 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerName="watcher-decision-engine" Feb 19 10:03:50 crc kubenswrapper[4873]: E0219 10:03:50.476083 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerName="watcher-decision-engine" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.476091 4873 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerName="watcher-decision-engine" Feb 19 10:03:50 crc kubenswrapper[4873]: E0219 10:03:50.476127 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerName="watcher-decision-engine" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.476137 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerName="watcher-decision-engine" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.476372 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerName="watcher-decision-engine" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.476397 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerName="watcher-decision-engine" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.476417 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerName="watcher-decision-engine" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.477245 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.479755 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.487228 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.556613 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecf8671-28f5-4549-a4c1-0cdad8421837-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.556676 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ecf8671-28f5-4549-a4c1-0cdad8421837-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.556706 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55pmh\" (UniqueName: \"kubernetes.io/projected/3ecf8671-28f5-4549-a4c1-0cdad8421837-kube-api-access-55pmh\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.556840 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecf8671-28f5-4549-a4c1-0cdad8421837-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " 
pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.556879 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ecf8671-28f5-4549-a4c1-0cdad8421837-logs\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.658465 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ecf8671-28f5-4549-a4c1-0cdad8421837-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.658847 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55pmh\" (UniqueName: \"kubernetes.io/projected/3ecf8671-28f5-4549-a4c1-0cdad8421837-kube-api-access-55pmh\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.658984 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecf8671-28f5-4549-a4c1-0cdad8421837-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.659059 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ecf8671-28f5-4549-a4c1-0cdad8421837-logs\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc 
kubenswrapper[4873]: I0219 10:03:50.659242 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecf8671-28f5-4549-a4c1-0cdad8421837-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.661755 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ecf8671-28f5-4549-a4c1-0cdad8421837-logs\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.682269 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3ecf8671-28f5-4549-a4c1-0cdad8421837-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.686145 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecf8671-28f5-4549-a4c1-0cdad8421837-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.691736 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecf8671-28f5-4549-a4c1-0cdad8421837-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.706908 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-55pmh\" (UniqueName: \"kubernetes.io/projected/3ecf8671-28f5-4549-a4c1-0cdad8421837-kube-api-access-55pmh\") pod \"watcher-decision-engine-0\" (UID: \"3ecf8671-28f5-4549-a4c1-0cdad8421837\") " pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.742899 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgm8t"] Feb 19 10:03:50 crc kubenswrapper[4873]: E0219 10:03:50.743440 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerName="watcher-decision-engine" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.743464 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerName="watcher-decision-engine" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.743723 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" containerName="watcher-decision-engine" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.744535 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.748973 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.751969 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.752159 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-c85mr" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.763486 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgm8t"] Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.798809 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.861820 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qgm8t\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.861935 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-config-data\") pod \"nova-cell0-conductor-db-sync-qgm8t\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.861960 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-scripts\") pod \"nova-cell0-conductor-db-sync-qgm8t\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.862059 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d74xc\" (UniqueName: \"kubernetes.io/projected/2f8fe617-c1d5-41f8-a23a-eeb88444f620-kube-api-access-d74xc\") pod \"nova-cell0-conductor-db-sync-qgm8t\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.963778 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-config-data\") pod \"nova-cell0-conductor-db-sync-qgm8t\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.963829 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-scripts\") pod \"nova-cell0-conductor-db-sync-qgm8t\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.963902 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d74xc\" (UniqueName: \"kubernetes.io/projected/2f8fe617-c1d5-41f8-a23a-eeb88444f620-kube-api-access-d74xc\") pod \"nova-cell0-conductor-db-sync-qgm8t\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.964016 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qgm8t\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.968593 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-scripts\") pod \"nova-cell0-conductor-db-sync-qgm8t\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.968744 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qgm8t\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.968896 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-config-data\") pod \"nova-cell0-conductor-db-sync-qgm8t\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:03:50 crc kubenswrapper[4873]: I0219 10:03:50.983716 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d74xc\" (UniqueName: \"kubernetes.io/projected/2f8fe617-c1d5-41f8-a23a-eeb88444f620-kube-api-access-d74xc\") pod \"nova-cell0-conductor-db-sync-qgm8t\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:03:51 crc kubenswrapper[4873]: I0219 10:03:51.079880 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:03:51 crc kubenswrapper[4873]: I0219 10:03:51.350307 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f3dabe51-c676-42bb-936a-d784ee2e565a","Type":"ContainerStarted","Data":"2be39cd056becc984b00fb6bbd9d4f93d9116f3c6319509fcab0eca65233b21c"} Feb 19 10:03:51 crc kubenswrapper[4873]: I0219 10:03:51.514991 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab7f7779-d6dd-4844-8af5-83ade972d9d0" path="/var/lib/kubelet/pods/ab7f7779-d6dd-4844-8af5-83ade972d9d0/volumes" Feb 19 10:03:51 crc kubenswrapper[4873]: I0219 10:03:51.695614 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgm8t"] Feb 19 10:03:51 crc kubenswrapper[4873]: I0219 10:03:51.798346 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:51 crc kubenswrapper[4873]: I0219 10:03:51.799915 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7c6d694569-qbpxm" Feb 19 10:03:51 crc kubenswrapper[4873]: I0219 10:03:51.831704 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Feb 19 10:03:51 crc kubenswrapper[4873]: W0219 10:03:51.837747 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ecf8671_28f5_4549_a4c1_0cdad8421837.slice/crio-c65e7d8b914be6bbb256158d479506d09a117eba3db4bd052d6b9f2bf1507ad5 WatchSource:0}: Error finding container c65e7d8b914be6bbb256158d479506d09a117eba3db4bd052d6b9f2bf1507ad5: Status 404 returned error can't find the container with id c65e7d8b914be6bbb256158d479506d09a117eba3db4bd052d6b9f2bf1507ad5 Feb 19 10:03:52 crc kubenswrapper[4873]: I0219 10:03:52.456933 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/watcher-decision-engine-0" event={"ID":"3ecf8671-28f5-4549-a4c1-0cdad8421837","Type":"ContainerStarted","Data":"9e5b6951ebca0fcf3c7affa218cccea2ee4fb678a4150c91a23f7158b5922791"} Feb 19 10:03:52 crc kubenswrapper[4873]: I0219 10:03:52.457235 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3ecf8671-28f5-4549-a4c1-0cdad8421837","Type":"ContainerStarted","Data":"c65e7d8b914be6bbb256158d479506d09a117eba3db4bd052d6b9f2bf1507ad5"} Feb 19 10:03:52 crc kubenswrapper[4873]: I0219 10:03:52.463653 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qgm8t" event={"ID":"2f8fe617-c1d5-41f8-a23a-eeb88444f620","Type":"ContainerStarted","Data":"5c51d1e42a2baea6c1a9d92c8fcf55ee9de4da189a67e10f9dca665987216a5f"} Feb 19 10:03:52 crc kubenswrapper[4873]: I0219 10:03:52.472293 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f3dabe51-c676-42bb-936a-d784ee2e565a","Type":"ContainerStarted","Data":"c6f0ae409955b29670854741f2c5f5c3af5ede9713476cd73c53562a322da01b"} Feb 19 10:03:52 crc kubenswrapper[4873]: I0219 10:03:52.473249 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 10:03:52 crc kubenswrapper[4873]: I0219 10:03:52.480211 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57a06649-fa26-4970-90c9-23271a1471a5","Type":"ContainerStarted","Data":"00fa225480a93439d5f2b6e9127475ed4c852f38cc7854f8dfea3786315c0e99"} Feb 19 10:03:52 crc kubenswrapper[4873]: I0219 10:03:52.480302 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.4802546469999998 podStartE2EDuration="2.480254647s" podCreationTimestamp="2026-02-19 10:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-19 10:03:52.476029552 +0000 UTC m=+1141.765461190" watchObservedRunningTime="2026-02-19 10:03:52.480254647 +0000 UTC m=+1141.769686295" Feb 19 10:03:52 crc kubenswrapper[4873]: I0219 10:03:52.480359 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 10:03:52 crc kubenswrapper[4873]: I0219 10:03:52.503548 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.503531459 podStartE2EDuration="3.503531459s" podCreationTimestamp="2026-02-19 10:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:03:52.496969395 +0000 UTC m=+1141.786401033" watchObservedRunningTime="2026-02-19 10:03:52.503531459 +0000 UTC m=+1141.792963097" Feb 19 10:03:52 crc kubenswrapper[4873]: I0219 10:03:52.518992 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.709469941 podStartE2EDuration="8.518973125s" podCreationTimestamp="2026-02-19 10:03:44 +0000 UTC" firstStartedPulling="2026-02-19 10:03:45.317372728 +0000 UTC m=+1134.606804366" lastFinishedPulling="2026-02-19 10:03:51.126875912 +0000 UTC m=+1140.416307550" observedRunningTime="2026-02-19 10:03:52.513676943 +0000 UTC m=+1141.803108601" watchObservedRunningTime="2026-02-19 10:03:52.518973125 +0000 UTC m=+1141.808404763" Feb 19 10:03:56 crc kubenswrapper[4873]: I0219 10:03:56.249636 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:03:56 crc kubenswrapper[4873]: I0219 10:03:56.250292 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="ceilometer-central-agent" containerID="cri-o://b9591c30a9a01ef428ff823d1f169650f4d7ce1f33aa25a91037d30f706c516b" gracePeriod=30 Feb 
19 10:03:56 crc kubenswrapper[4873]: I0219 10:03:56.250403 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="proxy-httpd" containerID="cri-o://00fa225480a93439d5f2b6e9127475ed4c852f38cc7854f8dfea3786315c0e99" gracePeriod=30 Feb 19 10:03:56 crc kubenswrapper[4873]: I0219 10:03:56.250443 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="sg-core" containerID="cri-o://1c794715d03a80d14cb4c1609cba1fa9ac955e06ef46f83a91a22fdc84e32a39" gracePeriod=30 Feb 19 10:03:56 crc kubenswrapper[4873]: I0219 10:03:56.250470 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="ceilometer-notification-agent" containerID="cri-o://285d8f5f3df00508865f8d3bde2dfe35df29620af39cdf5820cccfea23f27068" gracePeriod=30 Feb 19 10:03:56 crc kubenswrapper[4873]: I0219 10:03:56.528976 4873 generic.go:334] "Generic (PLEG): container finished" podID="57a06649-fa26-4970-90c9-23271a1471a5" containerID="00fa225480a93439d5f2b6e9127475ed4c852f38cc7854f8dfea3786315c0e99" exitCode=0 Feb 19 10:03:56 crc kubenswrapper[4873]: I0219 10:03:56.529224 4873 generic.go:334] "Generic (PLEG): container finished" podID="57a06649-fa26-4970-90c9-23271a1471a5" containerID="1c794715d03a80d14cb4c1609cba1fa9ac955e06ef46f83a91a22fdc84e32a39" exitCode=2 Feb 19 10:03:56 crc kubenswrapper[4873]: I0219 10:03:56.529062 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57a06649-fa26-4970-90c9-23271a1471a5","Type":"ContainerDied","Data":"00fa225480a93439d5f2b6e9127475ed4c852f38cc7854f8dfea3786315c0e99"} Feb 19 10:03:56 crc kubenswrapper[4873]: I0219 10:03:56.529260 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"57a06649-fa26-4970-90c9-23271a1471a5","Type":"ContainerDied","Data":"1c794715d03a80d14cb4c1609cba1fa9ac955e06ef46f83a91a22fdc84e32a39"} Feb 19 10:03:57 crc kubenswrapper[4873]: I0219 10:03:57.552033 4873 generic.go:334] "Generic (PLEG): container finished" podID="57a06649-fa26-4970-90c9-23271a1471a5" containerID="285d8f5f3df00508865f8d3bde2dfe35df29620af39cdf5820cccfea23f27068" exitCode=0 Feb 19 10:03:57 crc kubenswrapper[4873]: I0219 10:03:57.552072 4873 generic.go:334] "Generic (PLEG): container finished" podID="57a06649-fa26-4970-90c9-23271a1471a5" containerID="b9591c30a9a01ef428ff823d1f169650f4d7ce1f33aa25a91037d30f706c516b" exitCode=0 Feb 19 10:03:57 crc kubenswrapper[4873]: I0219 10:03:57.552093 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57a06649-fa26-4970-90c9-23271a1471a5","Type":"ContainerDied","Data":"285d8f5f3df00508865f8d3bde2dfe35df29620af39cdf5820cccfea23f27068"} Feb 19 10:03:57 crc kubenswrapper[4873]: I0219 10:03:57.552133 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57a06649-fa26-4970-90c9-23271a1471a5","Type":"ContainerDied","Data":"b9591c30a9a01ef428ff823d1f169650f4d7ce1f33aa25a91037d30f706c516b"} Feb 19 10:04:00 crc kubenswrapper[4873]: I0219 10:04:00.365352 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:04:00 crc kubenswrapper[4873]: I0219 10:04:00.366127 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" containerName="glance-httpd" containerID="cri-o://78be319f5d27debcef5596e89d486ade1ea7d0c8bf4ab5f7c035ba6d936419b4" gracePeriod=30 Feb 19 10:04:00 crc kubenswrapper[4873]: I0219 10:04:00.366043 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" containerName="glance-log" containerID="cri-o://69f1aeefee26a2a735764a65c138335e9092bf60086c87c1bdd76e2c3a2719f0" gracePeriod=30 Feb 19 10:04:00 crc kubenswrapper[4873]: I0219 10:04:00.593519 4873 generic.go:334] "Generic (PLEG): container finished" podID="0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" containerID="69f1aeefee26a2a735764a65c138335e9092bf60086c87c1bdd76e2c3a2719f0" exitCode=143 Feb 19 10:04:00 crc kubenswrapper[4873]: I0219 10:04:00.593567 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a","Type":"ContainerDied","Data":"69f1aeefee26a2a735764a65c138335e9092bf60086c87c1bdd76e2c3a2719f0"} Feb 19 10:04:00 crc kubenswrapper[4873]: I0219 10:04:00.799406 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Feb 19 10:04:00 crc kubenswrapper[4873]: I0219 10:04:00.851437 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Feb 19 10:04:01 crc kubenswrapper[4873]: I0219 10:04:01.610758 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Feb 19 10:04:01 crc kubenswrapper[4873]: I0219 10:04:01.660833 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Feb 19 10:04:01 crc kubenswrapper[4873]: I0219 10:04:01.899499 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:04:01 crc kubenswrapper[4873]: I0219 10:04:01.990660 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57a06649-fa26-4970-90c9-23271a1471a5-run-httpd\") pod \"57a06649-fa26-4970-90c9-23271a1471a5\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " Feb 19 10:04:01 crc kubenswrapper[4873]: I0219 10:04:01.990789 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-config-data\") pod \"57a06649-fa26-4970-90c9-23271a1471a5\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " Feb 19 10:04:01 crc kubenswrapper[4873]: I0219 10:04:01.990869 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-sg-core-conf-yaml\") pod \"57a06649-fa26-4970-90c9-23271a1471a5\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " Feb 19 10:04:01 crc kubenswrapper[4873]: I0219 10:04:01.990896 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-scripts\") pod \"57a06649-fa26-4970-90c9-23271a1471a5\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " Feb 19 10:04:01 crc kubenswrapper[4873]: I0219 10:04:01.990921 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57a06649-fa26-4970-90c9-23271a1471a5-log-httpd\") pod \"57a06649-fa26-4970-90c9-23271a1471a5\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " Feb 19 10:04:01 crc kubenswrapper[4873]: I0219 10:04:01.991007 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6nsr\" (UniqueName: 
\"kubernetes.io/projected/57a06649-fa26-4970-90c9-23271a1471a5-kube-api-access-g6nsr\") pod \"57a06649-fa26-4970-90c9-23271a1471a5\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " Feb 19 10:04:01 crc kubenswrapper[4873]: I0219 10:04:01.991066 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-combined-ca-bundle\") pod \"57a06649-fa26-4970-90c9-23271a1471a5\" (UID: \"57a06649-fa26-4970-90c9-23271a1471a5\") " Feb 19 10:04:01 crc kubenswrapper[4873]: I0219 10:04:01.991151 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a06649-fa26-4970-90c9-23271a1471a5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "57a06649-fa26-4970-90c9-23271a1471a5" (UID: "57a06649-fa26-4970-90c9-23271a1471a5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:04:01 crc kubenswrapper[4873]: I0219 10:04:01.991610 4873 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57a06649-fa26-4970-90c9-23271a1471a5-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:01 crc kubenswrapper[4873]: I0219 10:04:01.991649 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a06649-fa26-4970-90c9-23271a1471a5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "57a06649-fa26-4970-90c9-23271a1471a5" (UID: "57a06649-fa26-4970-90c9-23271a1471a5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:04:01 crc kubenswrapper[4873]: I0219 10:04:01.998724 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-scripts" (OuterVolumeSpecName: "scripts") pod "57a06649-fa26-4970-90c9-23271a1471a5" (UID: "57a06649-fa26-4970-90c9-23271a1471a5"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.017280 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "57a06649-fa26-4970-90c9-23271a1471a5" (UID: "57a06649-fa26-4970-90c9-23271a1471a5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.018863 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a06649-fa26-4970-90c9-23271a1471a5-kube-api-access-g6nsr" (OuterVolumeSpecName: "kube-api-access-g6nsr") pod "57a06649-fa26-4970-90c9-23271a1471a5" (UID: "57a06649-fa26-4970-90c9-23271a1471a5"). InnerVolumeSpecName "kube-api-access-g6nsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.095327 4873 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.095643 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.095654 4873 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57a06649-fa26-4970-90c9-23271a1471a5-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.095664 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6nsr\" (UniqueName: 
\"kubernetes.io/projected/57a06649-fa26-4970-90c9-23271a1471a5-kube-api-access-g6nsr\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.144356 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-config-data" (OuterVolumeSpecName: "config-data") pod "57a06649-fa26-4970-90c9-23271a1471a5" (UID: "57a06649-fa26-4970-90c9-23271a1471a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.148395 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57a06649-fa26-4970-90c9-23271a1471a5" (UID: "57a06649-fa26-4970-90c9-23271a1471a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.197275 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.197302 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a06649-fa26-4970-90c9-23271a1471a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.360654 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.537696 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.604098 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-config-data\") pod \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.604294 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.604928 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xhjk\" (UniqueName: \"kubernetes.io/projected/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-kube-api-access-9xhjk\") pod \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.605074 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-internal-tls-certs\") pod \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.605194 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-scripts\") pod \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.605274 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-logs\") pod \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.605312 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-httpd-run\") pod \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.605348 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-combined-ca-bundle\") pod \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\" (UID: \"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a\") " Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.606079 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" (UID: "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.606205 4873 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.606932 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-logs" (OuterVolumeSpecName: "logs") pod "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" (UID: "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.610246 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-scripts" (OuterVolumeSpecName: "scripts") pod "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" (UID: "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.615049 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" (UID: "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.621044 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-kube-api-access-9xhjk" (OuterVolumeSpecName: "kube-api-access-9xhjk") pod "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" (UID: "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a"). InnerVolumeSpecName "kube-api-access-9xhjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.646364 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qgm8t" event={"ID":"2f8fe617-c1d5-41f8-a23a-eeb88444f620","Type":"ContainerStarted","Data":"e9c86902c9a53b767e99a6b86b96ed298fea1e2244a0785fd8c7eeb7d4f69fa7"} Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.650467 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" (UID: "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.663963 4873 generic.go:334] "Generic (PLEG): container finished" podID="0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" containerID="78be319f5d27debcef5596e89d486ade1ea7d0c8bf4ab5f7c035ba6d936419b4" exitCode=0 Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.664042 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a","Type":"ContainerDied","Data":"78be319f5d27debcef5596e89d486ade1ea7d0c8bf4ab5f7c035ba6d936419b4"} Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.664067 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a","Type":"ContainerDied","Data":"27dd31c2ce043db502b2393bca38366353aaef589f4415a46def82efc00bdbd7"} Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.664083 4873 scope.go:117] "RemoveContainer" containerID="78be319f5d27debcef5596e89d486ade1ea7d0c8bf4ab5f7c035ba6d936419b4" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.664214 4873 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.665675 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qgm8t" podStartSLOduration=2.803989243 podStartE2EDuration="12.665666362s" podCreationTimestamp="2026-02-19 10:03:50 +0000 UTC" firstStartedPulling="2026-02-19 10:03:51.72540554 +0000 UTC m=+1141.014837178" lastFinishedPulling="2026-02-19 10:04:01.587082659 +0000 UTC m=+1150.876514297" observedRunningTime="2026-02-19 10:04:02.663846307 +0000 UTC m=+1151.953277945" watchObservedRunningTime="2026-02-19 10:04:02.665666362 +0000 UTC m=+1151.955098000" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.695425 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.697437 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57a06649-fa26-4970-90c9-23271a1471a5","Type":"ContainerDied","Data":"6bdc124061692b5087d0caebc56c02311827bc3d5ab68485c02dae362189e383"} Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.703241 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-config-data" (OuterVolumeSpecName: "config-data") pod "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" (UID: "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.708348 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.708385 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.708423 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.708436 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.708461 4873 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.708473 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xhjk\" (UniqueName: \"kubernetes.io/projected/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-kube-api-access-9xhjk\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.714302 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" (UID: "0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.736262 4873 scope.go:117] "RemoveContainer" containerID="69f1aeefee26a2a735764a65c138335e9092bf60086c87c1bdd76e2c3a2719f0" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.746654 4873 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.762171 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.810383 4873 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.810409 4873 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.820576 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.845162 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:04:02 crc kubenswrapper[4873]: E0219 10:04:02.845690 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" containerName="glance-log" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.845708 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" containerName="glance-log" Feb 19 10:04:02 crc kubenswrapper[4873]: E0219 10:04:02.845723 4873 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="sg-core" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.845730 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="sg-core" Feb 19 10:04:02 crc kubenswrapper[4873]: E0219 10:04:02.845750 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" containerName="glance-httpd" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.845758 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" containerName="glance-httpd" Feb 19 10:04:02 crc kubenswrapper[4873]: E0219 10:04:02.845769 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="ceilometer-central-agent" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.845777 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="ceilometer-central-agent" Feb 19 10:04:02 crc kubenswrapper[4873]: E0219 10:04:02.845788 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="proxy-httpd" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.845795 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="proxy-httpd" Feb 19 10:04:02 crc kubenswrapper[4873]: E0219 10:04:02.845810 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="ceilometer-notification-agent" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.845818 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="ceilometer-notification-agent" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.846061 4873 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="proxy-httpd" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.846076 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" containerName="glance-httpd" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.846088 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="sg-core" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.846121 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="ceilometer-central-agent" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.846137 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a06649-fa26-4970-90c9-23271a1471a5" containerName="ceilometer-notification-agent" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.846153 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" containerName="glance-log" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.848317 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.855660 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.862061 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.862227 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.887680 4873 scope.go:117] "RemoveContainer" containerID="78be319f5d27debcef5596e89d486ade1ea7d0c8bf4ab5f7c035ba6d936419b4" Feb 19 10:04:02 crc kubenswrapper[4873]: E0219 10:04:02.888343 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78be319f5d27debcef5596e89d486ade1ea7d0c8bf4ab5f7c035ba6d936419b4\": container with ID starting with 78be319f5d27debcef5596e89d486ade1ea7d0c8bf4ab5f7c035ba6d936419b4 not found: ID does not exist" containerID="78be319f5d27debcef5596e89d486ade1ea7d0c8bf4ab5f7c035ba6d936419b4" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.888390 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78be319f5d27debcef5596e89d486ade1ea7d0c8bf4ab5f7c035ba6d936419b4"} err="failed to get container status \"78be319f5d27debcef5596e89d486ade1ea7d0c8bf4ab5f7c035ba6d936419b4\": rpc error: code = NotFound desc = could not find container \"78be319f5d27debcef5596e89d486ade1ea7d0c8bf4ab5f7c035ba6d936419b4\": container with ID starting with 78be319f5d27debcef5596e89d486ade1ea7d0c8bf4ab5f7c035ba6d936419b4 not found: ID does not exist" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.888421 4873 scope.go:117] "RemoveContainer" containerID="69f1aeefee26a2a735764a65c138335e9092bf60086c87c1bdd76e2c3a2719f0" Feb 19 10:04:02 crc kubenswrapper[4873]: E0219 
10:04:02.888772 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69f1aeefee26a2a735764a65c138335e9092bf60086c87c1bdd76e2c3a2719f0\": container with ID starting with 69f1aeefee26a2a735764a65c138335e9092bf60086c87c1bdd76e2c3a2719f0 not found: ID does not exist" containerID="69f1aeefee26a2a735764a65c138335e9092bf60086c87c1bdd76e2c3a2719f0" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.888799 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69f1aeefee26a2a735764a65c138335e9092bf60086c87c1bdd76e2c3a2719f0"} err="failed to get container status \"69f1aeefee26a2a735764a65c138335e9092bf60086c87c1bdd76e2c3a2719f0\": rpc error: code = NotFound desc = could not find container \"69f1aeefee26a2a735764a65c138335e9092bf60086c87c1bdd76e2c3a2719f0\": container with ID starting with 69f1aeefee26a2a735764a65c138335e9092bf60086c87c1bdd76e2c3a2719f0 not found: ID does not exist" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.888815 4873 scope.go:117] "RemoveContainer" containerID="00fa225480a93439d5f2b6e9127475ed4c852f38cc7854f8dfea3786315c0e99" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.912505 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.912576 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-scripts\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.912601 4873 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.912655 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-run-httpd\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.912675 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-log-httpd\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.912849 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptdn9\" (UniqueName: \"kubernetes.io/projected/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-kube-api-access-ptdn9\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.912889 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-config-data\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:02 crc kubenswrapper[4873]: I0219 10:04:02.927281 4873 scope.go:117] "RemoveContainer" containerID="1c794715d03a80d14cb4c1609cba1fa9ac955e06ef46f83a91a22fdc84e32a39" Feb 19 10:04:02 
crc kubenswrapper[4873]: I0219 10:04:02.959746 4873 scope.go:117] "RemoveContainer" containerID="285d8f5f3df00508865f8d3bde2dfe35df29620af39cdf5820cccfea23f27068" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.007585 4873 scope.go:117] "RemoveContainer" containerID="b9591c30a9a01ef428ff823d1f169650f4d7ce1f33aa25a91037d30f706c516b" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.021957 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.022032 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-scripts\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.022054 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.022094 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-run-httpd\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.022136 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-log-httpd\") 
pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.022248 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptdn9\" (UniqueName: \"kubernetes.io/projected/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-kube-api-access-ptdn9\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.022273 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-config-data\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.022862 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.023779 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-log-httpd\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.024178 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-run-httpd\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.034230 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-config-data\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " 
pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.036740 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.042897 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-scripts\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.050898 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptdn9\" (UniqueName: \"kubernetes.io/projected/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-kube-api-access-ptdn9\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.055991 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") " pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.078190 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.095582 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.099925 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.104563 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.105385 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.112226 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.126171 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0df7963-e78f-457c-a27f-45c26232cfa7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.126221 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0df7963-e78f-457c-a27f-45c26232cfa7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.126264 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0df7963-e78f-457c-a27f-45c26232cfa7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.126282 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.126310 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl6g6\" (UniqueName: \"kubernetes.io/projected/c0df7963-e78f-457c-a27f-45c26232cfa7-kube-api-access-zl6g6\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.126552 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0df7963-e78f-457c-a27f-45c26232cfa7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.126621 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0df7963-e78f-457c-a27f-45c26232cfa7-logs\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.126653 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0df7963-e78f-457c-a27f-45c26232cfa7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.179235 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.228390 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0df7963-e78f-457c-a27f-45c26232cfa7-logs\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.228663 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0df7963-e78f-457c-a27f-45c26232cfa7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.228694 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0df7963-e78f-457c-a27f-45c26232cfa7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.228722 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0df7963-e78f-457c-a27f-45c26232cfa7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.228763 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0df7963-e78f-457c-a27f-45c26232cfa7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: 
I0219 10:04:03.228784 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.228811 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl6g6\" (UniqueName: \"kubernetes.io/projected/c0df7963-e78f-457c-a27f-45c26232cfa7-kube-api-access-zl6g6\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.228858 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0df7963-e78f-457c-a27f-45c26232cfa7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.228978 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0df7963-e78f-457c-a27f-45c26232cfa7-logs\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.229475 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.229875 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0df7963-e78f-457c-a27f-45c26232cfa7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.242136 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0df7963-e78f-457c-a27f-45c26232cfa7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.244741 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0df7963-e78f-457c-a27f-45c26232cfa7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.246359 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0df7963-e78f-457c-a27f-45c26232cfa7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.246794 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0df7963-e78f-457c-a27f-45c26232cfa7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.252365 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl6g6\" (UniqueName: 
\"kubernetes.io/projected/c0df7963-e78f-457c-a27f-45c26232cfa7-kube-api-access-zl6g6\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.269452 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c0df7963-e78f-457c-a27f-45c26232cfa7\") " pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.451537 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.529647 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a" path="/var/lib/kubelet/pods/0a3b30a9-f42d-4ac8-a0d0-9c03d0071c7a/volumes" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.530407 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a06649-fa26-4970-90c9-23271a1471a5" path="/var/lib/kubelet/pods/57a06649-fa26-4970-90c9-23271a1471a5/volumes" Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.560975 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:04:03 crc kubenswrapper[4873]: I0219 10:04:03.729255 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8","Type":"ContainerStarted","Data":"ef43583f8104679c840d249a101e9c9f6c6b978a9eee554010eb7c86975dede9"} Feb 19 10:04:04 crc kubenswrapper[4873]: I0219 10:04:04.084976 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 10:04:04 crc kubenswrapper[4873]: W0219 10:04:04.088631 4873 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0df7963_e78f_457c_a27f_45c26232cfa7.slice/crio-1625a014d94b008c2147d0b553e57391c7230229827cf9bb223d147043ac205c WatchSource:0}: Error finding container 1625a014d94b008c2147d0b553e57391c7230229827cf9bb223d147043ac205c: Status 404 returned error can't find the container with id 1625a014d94b008c2147d0b553e57391c7230229827cf9bb223d147043ac205c Feb 19 10:04:04 crc kubenswrapper[4873]: I0219 10:04:04.766686 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c0df7963-e78f-457c-a27f-45c26232cfa7","Type":"ContainerStarted","Data":"8a149e59a052af1d5903a3229e690795f2a465c06a67dc0c23e499c248c2cc1c"} Feb 19 10:04:04 crc kubenswrapper[4873]: I0219 10:04:04.766997 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c0df7963-e78f-457c-a27f-45c26232cfa7","Type":"ContainerStarted","Data":"1625a014d94b008c2147d0b553e57391c7230229827cf9bb223d147043ac205c"} Feb 19 10:04:04 crc kubenswrapper[4873]: I0219 10:04:04.789277 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8","Type":"ContainerStarted","Data":"bc2c5b3d4d716eb9c97dbd18c9c4f4216d977eab467433e514f9426fc5b0d5d5"} Feb 19 10:04:04 crc kubenswrapper[4873]: I0219 10:04:04.789325 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8","Type":"ContainerStarted","Data":"9f70d4dce463dab5272327a4c241561e484ce7f47d746707db1ef00e991c55a3"} Feb 19 10:04:05 crc kubenswrapper[4873]: I0219 10:04:05.798280 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c0df7963-e78f-457c-a27f-45c26232cfa7","Type":"ContainerStarted","Data":"69a8da314136c9cdd7ff53405c50ce004567d1311733a198b9e9331621ea2eea"} Feb 19 10:04:05 crc 
kubenswrapper[4873]: I0219 10:04:05.802506 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8","Type":"ContainerStarted","Data":"6c30461b33ded90ce2556c22f47ea9a3bb617a8a47e5f7bb60658aaae8782b55"} Feb 19 10:04:05 crc kubenswrapper[4873]: I0219 10:04:05.833393 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.83336859 podStartE2EDuration="2.83336859s" podCreationTimestamp="2026-02-19 10:04:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:04:05.819049842 +0000 UTC m=+1155.108481480" watchObservedRunningTime="2026-02-19 10:04:05.83336859 +0000 UTC m=+1155.122800248" Feb 19 10:04:08 crc kubenswrapper[4873]: I0219 10:04:08.566006 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 10:04:08 crc kubenswrapper[4873]: I0219 10:04:08.566795 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" containerName="glance-log" containerID="cri-o://00e17aa3c77a8dac057b0211e38bac6faa2ba84727374dd1e7825f5c8cd0363b" gracePeriod=30 Feb 19 10:04:08 crc kubenswrapper[4873]: I0219 10:04:08.566929 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" containerName="glance-httpd" containerID="cri-o://4ba322d3698975ce137f4a01e18bf101e9ef8127a707662eba0de8fcfb0c7ffe" gracePeriod=30 Feb 19 10:04:08 crc kubenswrapper[4873]: I0219 10:04:08.835082 4873 generic.go:334] "Generic (PLEG): container finished" podID="1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" containerID="00e17aa3c77a8dac057b0211e38bac6faa2ba84727374dd1e7825f5c8cd0363b" exitCode=143 Feb 19 
10:04:08 crc kubenswrapper[4873]: I0219 10:04:08.835143 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e","Type":"ContainerDied","Data":"00e17aa3c77a8dac057b0211e38bac6faa2ba84727374dd1e7825f5c8cd0363b"} Feb 19 10:04:08 crc kubenswrapper[4873]: I0219 10:04:08.838585 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8","Type":"ContainerStarted","Data":"260b2fefe2fc420bc547089ed503be33da9e38484f16bfd89fac4795ad82a3ee"} Feb 19 10:04:08 crc kubenswrapper[4873]: I0219 10:04:08.839033 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 10:04:08 crc kubenswrapper[4873]: I0219 10:04:08.860011 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.494011087 podStartE2EDuration="6.85998908s" podCreationTimestamp="2026-02-19 10:04:02 +0000 UTC" firstStartedPulling="2026-02-19 10:04:03.593220479 +0000 UTC m=+1152.882652117" lastFinishedPulling="2026-02-19 10:04:07.959198472 +0000 UTC m=+1157.248630110" observedRunningTime="2026-02-19 10:04:08.857557639 +0000 UTC m=+1158.146989287" watchObservedRunningTime="2026-02-19 10:04:08.85998908 +0000 UTC m=+1158.149420728" Feb 19 10:04:09 crc kubenswrapper[4873]: I0219 10:04:09.913796 4873 generic.go:334] "Generic (PLEG): container finished" podID="1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" containerID="4ba322d3698975ce137f4a01e18bf101e9ef8127a707662eba0de8fcfb0c7ffe" exitCode=0 Feb 19 10:04:09 crc kubenswrapper[4873]: I0219 10:04:09.913871 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e","Type":"ContainerDied","Data":"4ba322d3698975ce137f4a01e18bf101e9ef8127a707662eba0de8fcfb0c7ffe"} Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 
10:04:10.062378 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.158598 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggcss\" (UniqueName: \"kubernetes.io/projected/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-kube-api-access-ggcss\") pod \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.158651 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-scripts\") pod \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.158698 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-combined-ca-bundle\") pod \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.159360 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-logs\") pod \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.159463 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-config-data\") pod \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.159515 4873 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-public-tls-certs\") pod \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.159560 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.159584 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-httpd-run\") pod \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\" (UID: \"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e\") " Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.159742 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-logs" (OuterVolumeSpecName: "logs") pod "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" (UID: "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.160055 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.160361 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" (UID: "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.170046 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-kube-api-access-ggcss" (OuterVolumeSpecName: "kube-api-access-ggcss") pod "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" (UID: "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e"). InnerVolumeSpecName "kube-api-access-ggcss". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.175281 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-scripts" (OuterVolumeSpecName: "scripts") pod "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" (UID: "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.193441 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" (UID: "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.215120 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" (UID: "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.244587 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-config-data" (OuterVolumeSpecName: "config-data") pod "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" (UID: "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.244647 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" (UID: "1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.261563 4873 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.261589 4873 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.261600 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggcss\" (UniqueName: \"kubernetes.io/projected/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-kube-api-access-ggcss\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.261610 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:10 crc 
kubenswrapper[4873]: I0219 10:04:10.261618 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.261626 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.261636 4873 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.283649 4873 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.363133 4873 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.924606 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e","Type":"ContainerDied","Data":"d67a114aa214902359c1e29c718493f6dd023a96ca4f5d6261a47eb5f1d136c6"} Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.924652 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.924667 4873 scope.go:117] "RemoveContainer" containerID="4ba322d3698975ce137f4a01e18bf101e9ef8127a707662eba0de8fcfb0c7ffe"
Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.957250 4873 scope.go:117] "RemoveContainer" containerID="00e17aa3c77a8dac057b0211e38bac6faa2ba84727374dd1e7825f5c8cd0363b"
Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.957512 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.968362 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.984422 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 10:04:10 crc kubenswrapper[4873]: E0219 10:04:10.984833 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" containerName="glance-httpd"
Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.984849 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" containerName="glance-httpd"
Feb 19 10:04:10 crc kubenswrapper[4873]: E0219 10:04:10.984866 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" containerName="glance-log"
Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.984874 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" containerName="glance-log"
Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.985062 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" containerName="glance-log"
Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.985082 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" containerName="glance-httpd"
Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.986081 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.990525 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 19 10:04:10 crc kubenswrapper[4873]: I0219 10:04:10.990710 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.017038 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.073674 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r58zd\" (UniqueName: \"kubernetes.io/projected/09cfd898-398f-41ae-8c45-1ed215b69683-kube-api-access-r58zd\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.073742 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09cfd898-398f-41ae-8c45-1ed215b69683-config-data\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.073766 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09cfd898-398f-41ae-8c45-1ed215b69683-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.073784 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09cfd898-398f-41ae-8c45-1ed215b69683-scripts\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.073799 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09cfd898-398f-41ae-8c45-1ed215b69683-logs\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.073827 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09cfd898-398f-41ae-8c45-1ed215b69683-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.073895 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.073912 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09cfd898-398f-41ae-8c45-1ed215b69683-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.175884 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r58zd\" (UniqueName: \"kubernetes.io/projected/09cfd898-398f-41ae-8c45-1ed215b69683-kube-api-access-r58zd\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.175966 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09cfd898-398f-41ae-8c45-1ed215b69683-config-data\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.175992 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09cfd898-398f-41ae-8c45-1ed215b69683-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.176015 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09cfd898-398f-41ae-8c45-1ed215b69683-scripts\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.176037 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09cfd898-398f-41ae-8c45-1ed215b69683-logs\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.176087 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09cfd898-398f-41ae-8c45-1ed215b69683-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.176198 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.176223 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09cfd898-398f-41ae-8c45-1ed215b69683-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.176834 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.177008 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09cfd898-398f-41ae-8c45-1ed215b69683-logs\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.177025 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09cfd898-398f-41ae-8c45-1ed215b69683-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.181738 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09cfd898-398f-41ae-8c45-1ed215b69683-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.181808 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09cfd898-398f-41ae-8c45-1ed215b69683-scripts\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.182411 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09cfd898-398f-41ae-8c45-1ed215b69683-config-data\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.192989 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r58zd\" (UniqueName: \"kubernetes.io/projected/09cfd898-398f-41ae-8c45-1ed215b69683-kube-api-access-r58zd\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.193819 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09cfd898-398f-41ae-8c45-1ed215b69683-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.206334 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"09cfd898-398f-41ae-8c45-1ed215b69683\") " pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.318712 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.356848 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.357208 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="ceilometer-central-agent" containerID="cri-o://9f70d4dce463dab5272327a4c241561e484ce7f47d746707db1ef00e991c55a3" gracePeriod=30
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.357642 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="proxy-httpd" containerID="cri-o://260b2fefe2fc420bc547089ed503be33da9e38484f16bfd89fac4795ad82a3ee" gracePeriod=30
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.357693 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="sg-core" containerID="cri-o://6c30461b33ded90ce2556c22f47ea9a3bb617a8a47e5f7bb60658aaae8782b55" gracePeriod=30
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.357731 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="ceilometer-notification-agent" containerID="cri-o://bc2c5b3d4d716eb9c97dbd18c9c4f4216d977eab467433e514f9426fc5b0d5d5" gracePeriod=30
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.536796 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e" path="/var/lib/kubelet/pods/1da8b72b-fdc0-4c00-a1da-cdb5e8e04e8e/volumes"
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.926433 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.948011 4873 generic.go:334] "Generic (PLEG): container finished" podID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerID="260b2fefe2fc420bc547089ed503be33da9e38484f16bfd89fac4795ad82a3ee" exitCode=0
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.948056 4873 generic.go:334] "Generic (PLEG): container finished" podID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerID="6c30461b33ded90ce2556c22f47ea9a3bb617a8a47e5f7bb60658aaae8782b55" exitCode=2
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.948085 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8","Type":"ContainerDied","Data":"260b2fefe2fc420bc547089ed503be33da9e38484f16bfd89fac4795ad82a3ee"}
Feb 19 10:04:11 crc kubenswrapper[4873]: I0219 10:04:11.948132 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8","Type":"ContainerDied","Data":"6c30461b33ded90ce2556c22f47ea9a3bb617a8a47e5f7bb60658aaae8782b55"}
Feb 19 10:04:12 crc kubenswrapper[4873]: I0219 10:04:12.959844 4873 generic.go:334] "Generic (PLEG): container finished" podID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerID="bc2c5b3d4d716eb9c97dbd18c9c4f4216d977eab467433e514f9426fc5b0d5d5" exitCode=0
Feb 19 10:04:12 crc kubenswrapper[4873]: I0219 10:04:12.960003 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8","Type":"ContainerDied","Data":"bc2c5b3d4d716eb9c97dbd18c9c4f4216d977eab467433e514f9426fc5b0d5d5"}
Feb 19 10:04:12 crc kubenswrapper[4873]: I0219 10:04:12.961539 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09cfd898-398f-41ae-8c45-1ed215b69683","Type":"ContainerStarted","Data":"3ce17d2d97b2dc6fe772c16831da4ec0b873c6b5ac2967b58cd02540d9bfbe45"}
Feb 19 10:04:12 crc kubenswrapper[4873]: I0219 10:04:12.961565 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09cfd898-398f-41ae-8c45-1ed215b69683","Type":"ContainerStarted","Data":"fb5ed99e6fd5bdd21c675a8343e48db30ee86490a2c63b9c1e93da65969c48b3"}
Feb 19 10:04:13 crc kubenswrapper[4873]: I0219 10:04:13.453484 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 10:04:13 crc kubenswrapper[4873]: I0219 10:04:13.453552 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 10:04:13 crc kubenswrapper[4873]: I0219 10:04:13.495776 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 10:04:13 crc kubenswrapper[4873]: I0219 10:04:13.498308 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 10:04:13 crc kubenswrapper[4873]: I0219 10:04:13.976567 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09cfd898-398f-41ae-8c45-1ed215b69683","Type":"ContainerStarted","Data":"1ea2be76674c1bf99146edf11fcce7cce9a4f0501326fefbfc0f8f89106f3f7a"}
Feb 19 10:04:13 crc kubenswrapper[4873]: I0219 10:04:13.978462 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 10:04:13 crc kubenswrapper[4873]: I0219 10:04:13.978496 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 10:04:14 crc kubenswrapper[4873]: I0219 10:04:14.001557 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.00154145 podStartE2EDuration="4.00154145s" podCreationTimestamp="2026-02-19 10:04:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:04:13.998432112 +0000 UTC m=+1163.287863750" watchObservedRunningTime="2026-02-19 10:04:14.00154145 +0000 UTC m=+1163.290973088"
Feb 19 10:04:15 crc kubenswrapper[4873]: I0219 10:04:15.767875 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 10:04:15 crc kubenswrapper[4873]: I0219 10:04:15.770615 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:15.999936 4873 generic.go:334] "Generic (PLEG): container finished" podID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerID="9f70d4dce463dab5272327a4c241561e484ce7f47d746707db1ef00e991c55a3" exitCode=0
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.000083 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8","Type":"ContainerDied","Data":"9f70d4dce463dab5272327a4c241561e484ce7f47d746707db1ef00e991c55a3"}
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.237237 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.393732 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-config-data\") pod \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") "
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.393798 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptdn9\" (UniqueName: \"kubernetes.io/projected/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-kube-api-access-ptdn9\") pod \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") "
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.393837 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-log-httpd\") pod \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") "
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.393868 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-combined-ca-bundle\") pod \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") "
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.393909 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-sg-core-conf-yaml\") pod \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") "
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.393931 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-scripts\") pod \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") "
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.393959 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-run-httpd\") pod \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\" (UID: \"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8\") "
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.394953 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" (UID: "8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.397860 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" (UID: "8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.400690 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-scripts" (OuterVolumeSpecName: "scripts") pod "8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" (UID: "8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.401026 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-kube-api-access-ptdn9" (OuterVolumeSpecName: "kube-api-access-ptdn9") pod "8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" (UID: "8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8"). InnerVolumeSpecName "kube-api-access-ptdn9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.430094 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" (UID: "8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.474396 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" (UID: "8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.496961 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.496997 4873 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.497010 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.497023 4873 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.497034 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptdn9\" (UniqueName: \"kubernetes.io/projected/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-kube-api-access-ptdn9\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.497048 4873 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.501345 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-config-data" (OuterVolumeSpecName: "config-data") pod "8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" (UID: "8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:16 crc kubenswrapper[4873]: I0219 10:04:16.599413 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.014324 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8","Type":"ContainerDied","Data":"ef43583f8104679c840d249a101e9c9f6c6b978a9eee554010eb7c86975dede9"}
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.014831 4873 scope.go:117] "RemoveContainer" containerID="260b2fefe2fc420bc547089ed503be33da9e38484f16bfd89fac4795ad82a3ee"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.014403 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.041698 4873 scope.go:117] "RemoveContainer" containerID="6c30461b33ded90ce2556c22f47ea9a3bb617a8a47e5f7bb60658aaae8782b55"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.087161 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.104482 4873 scope.go:117] "RemoveContainer" containerID="bc2c5b3d4d716eb9c97dbd18c9c4f4216d977eab467433e514f9426fc5b0d5d5"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.112641 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.123595 4873 scope.go:117] "RemoveContainer" containerID="9f70d4dce463dab5272327a4c241561e484ce7f47d746707db1ef00e991c55a3"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.125880 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 10:04:17 crc kubenswrapper[4873]: E0219 10:04:17.126218 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="ceilometer-notification-agent"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.126234 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="ceilometer-notification-agent"
Feb 19 10:04:17 crc kubenswrapper[4873]: E0219 10:04:17.126252 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="sg-core"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.126260 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="sg-core"
Feb 19 10:04:17 crc kubenswrapper[4873]: E0219 10:04:17.126272 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="ceilometer-central-agent"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.126278 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="ceilometer-central-agent"
Feb 19 10:04:17 crc kubenswrapper[4873]: E0219 10:04:17.126296 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="proxy-httpd"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.126301 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="proxy-httpd"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.126471 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="ceilometer-notification-agent"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.126485 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="proxy-httpd"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.126495 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="sg-core"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.126511 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" containerName="ceilometer-central-agent"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.128887 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.134440 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.134836 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.143692 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.315357 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-config-data\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.315404 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e037b85a-1abe-41da-a113-59129451f35f-run-httpd\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.315450 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.315541 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbx52\" (UniqueName: \"kubernetes.io/projected/e037b85a-1abe-41da-a113-59129451f35f-kube-api-access-dbx52\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.315588 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.315699 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e037b85a-1abe-41da-a113-59129451f35f-log-httpd\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.315744 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-scripts\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.416928 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.417289 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbx52\" (UniqueName: \"kubernetes.io/projected/e037b85a-1abe-41da-a113-59129451f35f-kube-api-access-dbx52\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.417328 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.417915 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e037b85a-1abe-41da-a113-59129451f35f-log-httpd\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.417415 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e037b85a-1abe-41da-a113-59129451f35f-log-httpd\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.417985 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-scripts\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.418063 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-config-data\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.418096 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e037b85a-1abe-41da-a113-59129451f35f-run-httpd\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.418443 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e037b85a-1abe-41da-a113-59129451f35f-run-httpd\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.426294 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-scripts\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.427901 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.428552 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-config-data\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.430163 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.440136 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbx52\" (UniqueName: \"kubernetes.io/projected/e037b85a-1abe-41da-a113-59129451f35f-kube-api-access-dbx52\") pod \"ceilometer-0\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.463385 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.499674 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8" path="/var/lib/kubelet/pods/8efa5d9d-e97a-4fef-bb67-af8bdbe7b3f8/volumes"
Feb 19 10:04:17 crc kubenswrapper[4873]: I0219 10:04:17.900419 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 10:04:17 crc kubenswrapper[4873]: W0219 10:04:17.911398 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode037b85a_1abe_41da_a113_59129451f35f.slice/crio-14e5ec3f770f34a8b125fd308c847df1b6b0992739a9c9f49371a6a91090b5cf WatchSource:0}: Error finding container 14e5ec3f770f34a8b125fd308c847df1b6b0992739a9c9f49371a6a91090b5cf: Status 404 returned error can't find the container with id 14e5ec3f770f34a8b125fd308c847df1b6b0992739a9c9f49371a6a91090b5cf
Feb 19 10:04:18 crc kubenswrapper[4873]: I0219 10:04:18.025690 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0"
event={"ID":"e037b85a-1abe-41da-a113-59129451f35f","Type":"ContainerStarted","Data":"14e5ec3f770f34a8b125fd308c847df1b6b0992739a9c9f49371a6a91090b5cf"} Feb 19 10:04:18 crc kubenswrapper[4873]: I0219 10:04:18.240174 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:04:18 crc kubenswrapper[4873]: I0219 10:04:18.240476 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:04:18 crc kubenswrapper[4873]: I0219 10:04:18.240525 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 10:04:18 crc kubenswrapper[4873]: I0219 10:04:18.241074 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4cf449f514dc24e840144e6f6decb8f1a064252cdbd9c34d791686fe659362f0"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 10:04:18 crc kubenswrapper[4873]: I0219 10:04:18.241168 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://4cf449f514dc24e840144e6f6decb8f1a064252cdbd9c34d791686fe659362f0" gracePeriod=600 Feb 19 10:04:19 crc kubenswrapper[4873]: I0219 10:04:19.037865 
4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e037b85a-1abe-41da-a113-59129451f35f","Type":"ContainerStarted","Data":"3c42726335cd97f7c97751f39a53a6a1c3ede5f0168be405b1c1c8f5ccc72a65"} Feb 19 10:04:19 crc kubenswrapper[4873]: I0219 10:04:19.038240 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e037b85a-1abe-41da-a113-59129451f35f","Type":"ContainerStarted","Data":"0a629e4ae5a7dbd420903dd59abab426b5d7e8df23fe2344c827772936600ccb"} Feb 19 10:04:19 crc kubenswrapper[4873]: I0219 10:04:19.041049 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="4cf449f514dc24e840144e6f6decb8f1a064252cdbd9c34d791686fe659362f0" exitCode=0 Feb 19 10:04:19 crc kubenswrapper[4873]: I0219 10:04:19.041093 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"4cf449f514dc24e840144e6f6decb8f1a064252cdbd9c34d791686fe659362f0"} Feb 19 10:04:19 crc kubenswrapper[4873]: I0219 10:04:19.041127 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"fe114037dbb1e5c10911ab253f48b67258ca8f08b33d891b20892a3cde8544ad"} Feb 19 10:04:19 crc kubenswrapper[4873]: I0219 10:04:19.041141 4873 scope.go:117] "RemoveContainer" containerID="4e9052ea8663914dbd7738866b6f51c9865aab9ba0562919ffd7db3fb01e7ded" Feb 19 10:04:20 crc kubenswrapper[4873]: I0219 10:04:20.051730 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e037b85a-1abe-41da-a113-59129451f35f","Type":"ContainerStarted","Data":"6cc5365f6277dce3c3167a3985183e662248a1d4d6e0a45883e0d5e65cacc65f"} Feb 19 10:04:21 crc kubenswrapper[4873]: I0219 
10:04:21.319602 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 10:04:21 crc kubenswrapper[4873]: I0219 10:04:21.319881 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 10:04:21 crc kubenswrapper[4873]: I0219 10:04:21.352798 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 10:04:21 crc kubenswrapper[4873]: I0219 10:04:21.372462 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 10:04:22 crc kubenswrapper[4873]: I0219 10:04:22.075674 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e037b85a-1abe-41da-a113-59129451f35f","Type":"ContainerStarted","Data":"e34e712b4e7100f172666d402654e7bfd335c81523cf9222b376f6ba0a3e3352"} Feb 19 10:04:22 crc kubenswrapper[4873]: I0219 10:04:22.075997 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 10:04:22 crc kubenswrapper[4873]: I0219 10:04:22.077709 4873 generic.go:334] "Generic (PLEG): container finished" podID="2f8fe617-c1d5-41f8-a23a-eeb88444f620" containerID="e9c86902c9a53b767e99a6b86b96ed298fea1e2244a0785fd8c7eeb7d4f69fa7" exitCode=0 Feb 19 10:04:22 crc kubenswrapper[4873]: I0219 10:04:22.080163 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qgm8t" event={"ID":"2f8fe617-c1d5-41f8-a23a-eeb88444f620","Type":"ContainerDied","Data":"e9c86902c9a53b767e99a6b86b96ed298fea1e2244a0785fd8c7eeb7d4f69fa7"} Feb 19 10:04:22 crc kubenswrapper[4873]: I0219 10:04:22.080200 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 10:04:22 crc kubenswrapper[4873]: I0219 10:04:22.080321 4873 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 10:04:22 crc kubenswrapper[4873]: I0219 10:04:22.126633 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.70887426 podStartE2EDuration="5.126611411s" podCreationTimestamp="2026-02-19 10:04:17 +0000 UTC" firstStartedPulling="2026-02-19 10:04:17.91350922 +0000 UTC m=+1167.202940858" lastFinishedPulling="2026-02-19 10:04:21.331246371 +0000 UTC m=+1170.620678009" observedRunningTime="2026-02-19 10:04:22.107529254 +0000 UTC m=+1171.396960892" watchObservedRunningTime="2026-02-19 10:04:22.126611411 +0000 UTC m=+1171.416043049" Feb 19 10:04:23 crc kubenswrapper[4873]: I0219 10:04:23.519455 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:04:23 crc kubenswrapper[4873]: I0219 10:04:23.644733 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-scripts\") pod \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " Feb 19 10:04:23 crc kubenswrapper[4873]: I0219 10:04:23.645170 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-combined-ca-bundle\") pod \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " Feb 19 10:04:23 crc kubenswrapper[4873]: I0219 10:04:23.645348 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d74xc\" (UniqueName: \"kubernetes.io/projected/2f8fe617-c1d5-41f8-a23a-eeb88444f620-kube-api-access-d74xc\") pod \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " Feb 19 10:04:23 crc kubenswrapper[4873]: I0219 
10:04:23.645422 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-config-data\") pod \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\" (UID: \"2f8fe617-c1d5-41f8-a23a-eeb88444f620\") " Feb 19 10:04:23 crc kubenswrapper[4873]: I0219 10:04:23.654263 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f8fe617-c1d5-41f8-a23a-eeb88444f620-kube-api-access-d74xc" (OuterVolumeSpecName: "kube-api-access-d74xc") pod "2f8fe617-c1d5-41f8-a23a-eeb88444f620" (UID: "2f8fe617-c1d5-41f8-a23a-eeb88444f620"). InnerVolumeSpecName "kube-api-access-d74xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:04:23 crc kubenswrapper[4873]: I0219 10:04:23.654377 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-scripts" (OuterVolumeSpecName: "scripts") pod "2f8fe617-c1d5-41f8-a23a-eeb88444f620" (UID: "2f8fe617-c1d5-41f8-a23a-eeb88444f620"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:23 crc kubenswrapper[4873]: I0219 10:04:23.719474 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f8fe617-c1d5-41f8-a23a-eeb88444f620" (UID: "2f8fe617-c1d5-41f8-a23a-eeb88444f620"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:23 crc kubenswrapper[4873]: I0219 10:04:23.749543 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:23 crc kubenswrapper[4873]: I0219 10:04:23.749588 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:23 crc kubenswrapper[4873]: I0219 10:04:23.749603 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d74xc\" (UniqueName: \"kubernetes.io/projected/2f8fe617-c1d5-41f8-a23a-eeb88444f620-kube-api-access-d74xc\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:23 crc kubenswrapper[4873]: I0219 10:04:23.805997 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-config-data" (OuterVolumeSpecName: "config-data") pod "2f8fe617-c1d5-41f8-a23a-eeb88444f620" (UID: "2f8fe617-c1d5-41f8-a23a-eeb88444f620"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:23 crc kubenswrapper[4873]: I0219 10:04:23.851642 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8fe617-c1d5-41f8-a23a-eeb88444f620-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.096242 4873 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.096255 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qgm8t" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.096262 4873 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.096247 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qgm8t" event={"ID":"2f8fe617-c1d5-41f8-a23a-eeb88444f620","Type":"ContainerDied","Data":"5c51d1e42a2baea6c1a9d92c8fcf55ee9de4da189a67e10f9dca665987216a5f"} Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.096470 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c51d1e42a2baea6c1a9d92c8fcf55ee9de4da189a67e10f9dca665987216a5f" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.228566 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 10:04:24 crc kubenswrapper[4873]: E0219 10:04:24.229239 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f8fe617-c1d5-41f8-a23a-eeb88444f620" containerName="nova-cell0-conductor-db-sync" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.229256 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f8fe617-c1d5-41f8-a23a-eeb88444f620" containerName="nova-cell0-conductor-db-sync" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.229422 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f8fe617-c1d5-41f8-a23a-eeb88444f620" containerName="nova-cell0-conductor-db-sync" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.230044 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.231796 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.231950 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-c85mr" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.252264 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.363404 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25b9f1f-0533-4e00-a926-08639b1b2266-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c25b9f1f-0533-4e00-a926-08639b1b2266\") " pod="openstack/nova-cell0-conductor-0" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.363877 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg566\" (UniqueName: \"kubernetes.io/projected/c25b9f1f-0533-4e00-a926-08639b1b2266-kube-api-access-dg566\") pod \"nova-cell0-conductor-0\" (UID: \"c25b9f1f-0533-4e00-a926-08639b1b2266\") " pod="openstack/nova-cell0-conductor-0" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.363956 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25b9f1f-0533-4e00-a926-08639b1b2266-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c25b9f1f-0533-4e00-a926-08639b1b2266\") " pod="openstack/nova-cell0-conductor-0" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.411348 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 
10:04:24.465564 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg566\" (UniqueName: \"kubernetes.io/projected/c25b9f1f-0533-4e00-a926-08639b1b2266-kube-api-access-dg566\") pod \"nova-cell0-conductor-0\" (UID: \"c25b9f1f-0533-4e00-a926-08639b1b2266\") " pod="openstack/nova-cell0-conductor-0" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.465639 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25b9f1f-0533-4e00-a926-08639b1b2266-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c25b9f1f-0533-4e00-a926-08639b1b2266\") " pod="openstack/nova-cell0-conductor-0" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.465697 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25b9f1f-0533-4e00-a926-08639b1b2266-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c25b9f1f-0533-4e00-a926-08639b1b2266\") " pod="openstack/nova-cell0-conductor-0" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.471553 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c25b9f1f-0533-4e00-a926-08639b1b2266-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c25b9f1f-0533-4e00-a926-08639b1b2266\") " pod="openstack/nova-cell0-conductor-0" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.472067 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c25b9f1f-0533-4e00-a926-08639b1b2266-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c25b9f1f-0533-4e00-a926-08639b1b2266\") " pod="openstack/nova-cell0-conductor-0" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.491177 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg566\" 
(UniqueName: \"kubernetes.io/projected/c25b9f1f-0533-4e00-a926-08639b1b2266-kube-api-access-dg566\") pod \"nova-cell0-conductor-0\" (UID: \"c25b9f1f-0533-4e00-a926-08639b1b2266\") " pod="openstack/nova-cell0-conductor-0" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.574901 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 10:04:24 crc kubenswrapper[4873]: I0219 10:04:24.875969 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 10:04:25 crc kubenswrapper[4873]: I0219 10:04:25.039167 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 10:04:25 crc kubenswrapper[4873]: I0219 10:04:25.108647 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c25b9f1f-0533-4e00-a926-08639b1b2266","Type":"ContainerStarted","Data":"5186d19991d4137e906d0001a7a43a897ccefac9bc9021e4c650a7bca263ec2f"} Feb 19 10:04:25 crc kubenswrapper[4873]: I0219 10:04:25.959844 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:04:25 crc kubenswrapper[4873]: I0219 10:04:25.960374 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="ceilometer-central-agent" containerID="cri-o://0a629e4ae5a7dbd420903dd59abab426b5d7e8df23fe2344c827772936600ccb" gracePeriod=30 Feb 19 10:04:25 crc kubenswrapper[4873]: I0219 10:04:25.960442 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="proxy-httpd" containerID="cri-o://e34e712b4e7100f172666d402654e7bfd335c81523cf9222b376f6ba0a3e3352" gracePeriod=30 Feb 19 10:04:25 crc kubenswrapper[4873]: I0219 10:04:25.960442 4873 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="sg-core" containerID="cri-o://6cc5365f6277dce3c3167a3985183e662248a1d4d6e0a45883e0d5e65cacc65f" gracePeriod=30 Feb 19 10:04:25 crc kubenswrapper[4873]: I0219 10:04:25.960481 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="ceilometer-notification-agent" containerID="cri-o://3c42726335cd97f7c97751f39a53a6a1c3ede5f0168be405b1c1c8f5ccc72a65" gracePeriod=30 Feb 19 10:04:26 crc kubenswrapper[4873]: I0219 10:04:26.119485 4873 generic.go:334] "Generic (PLEG): container finished" podID="e037b85a-1abe-41da-a113-59129451f35f" containerID="e34e712b4e7100f172666d402654e7bfd335c81523cf9222b376f6ba0a3e3352" exitCode=0 Feb 19 10:04:26 crc kubenswrapper[4873]: I0219 10:04:26.119797 4873 generic.go:334] "Generic (PLEG): container finished" podID="e037b85a-1abe-41da-a113-59129451f35f" containerID="6cc5365f6277dce3c3167a3985183e662248a1d4d6e0a45883e0d5e65cacc65f" exitCode=2 Feb 19 10:04:26 crc kubenswrapper[4873]: I0219 10:04:26.119568 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e037b85a-1abe-41da-a113-59129451f35f","Type":"ContainerDied","Data":"e34e712b4e7100f172666d402654e7bfd335c81523cf9222b376f6ba0a3e3352"} Feb 19 10:04:26 crc kubenswrapper[4873]: I0219 10:04:26.119864 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e037b85a-1abe-41da-a113-59129451f35f","Type":"ContainerDied","Data":"6cc5365f6277dce3c3167a3985183e662248a1d4d6e0a45883e0d5e65cacc65f"} Feb 19 10:04:26 crc kubenswrapper[4873]: I0219 10:04:26.121985 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"c25b9f1f-0533-4e00-a926-08639b1b2266","Type":"ContainerStarted","Data":"46e88f8a01f1bbc0f588e2ef530b6b2c65675b7bdc761574c9c73fdc8cdb8472"} Feb 19 10:04:26 crc kubenswrapper[4873]: I0219 10:04:26.122179 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 10:04:26 crc kubenswrapper[4873]: I0219 10:04:26.142758 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.142737956 podStartE2EDuration="2.142737956s" podCreationTimestamp="2026-02-19 10:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:04:26.140976412 +0000 UTC m=+1175.430408050" watchObservedRunningTime="2026-02-19 10:04:26.142737956 +0000 UTC m=+1175.432169594" Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.153224 4873 generic.go:334] "Generic (PLEG): container finished" podID="e037b85a-1abe-41da-a113-59129451f35f" containerID="3c42726335cd97f7c97751f39a53a6a1c3ede5f0168be405b1c1c8f5ccc72a65" exitCode=0 Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.153813 4873 generic.go:334] "Generic (PLEG): container finished" podID="e037b85a-1abe-41da-a113-59129451f35f" containerID="0a629e4ae5a7dbd420903dd59abab426b5d7e8df23fe2344c827772936600ccb" exitCode=0 Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.153277 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e037b85a-1abe-41da-a113-59129451f35f","Type":"ContainerDied","Data":"3c42726335cd97f7c97751f39a53a6a1c3ede5f0168be405b1c1c8f5ccc72a65"} Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.153886 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e037b85a-1abe-41da-a113-59129451f35f","Type":"ContainerDied","Data":"0a629e4ae5a7dbd420903dd59abab426b5d7e8df23fe2344c827772936600ccb"} Feb 19 
10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.327828 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.442833 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbx52\" (UniqueName: \"kubernetes.io/projected/e037b85a-1abe-41da-a113-59129451f35f-kube-api-access-dbx52\") pod \"e037b85a-1abe-41da-a113-59129451f35f\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.442879 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-combined-ca-bundle\") pod \"e037b85a-1abe-41da-a113-59129451f35f\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.442915 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-config-data\") pod \"e037b85a-1abe-41da-a113-59129451f35f\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.442931 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e037b85a-1abe-41da-a113-59129451f35f-run-httpd\") pod \"e037b85a-1abe-41da-a113-59129451f35f\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.442991 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-sg-core-conf-yaml\") pod \"e037b85a-1abe-41da-a113-59129451f35f\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") " Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 
10:04:28.443751    4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-scripts\") pod \"e037b85a-1abe-41da-a113-59129451f35f\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") "
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.443831    4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e037b85a-1abe-41da-a113-59129451f35f-log-httpd\") pod \"e037b85a-1abe-41da-a113-59129451f35f\" (UID: \"e037b85a-1abe-41da-a113-59129451f35f\") "
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.444319    4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e037b85a-1abe-41da-a113-59129451f35f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e037b85a-1abe-41da-a113-59129451f35f" (UID: "e037b85a-1abe-41da-a113-59129451f35f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.444378    4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e037b85a-1abe-41da-a113-59129451f35f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e037b85a-1abe-41da-a113-59129451f35f" (UID: "e037b85a-1abe-41da-a113-59129451f35f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.445356    4873 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e037b85a-1abe-41da-a113-59129451f35f-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.445542    4873 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e037b85a-1abe-41da-a113-59129451f35f-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.449239    4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-scripts" (OuterVolumeSpecName: "scripts") pod "e037b85a-1abe-41da-a113-59129451f35f" (UID: "e037b85a-1abe-41da-a113-59129451f35f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.452633    4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e037b85a-1abe-41da-a113-59129451f35f-kube-api-access-dbx52" (OuterVolumeSpecName: "kube-api-access-dbx52") pod "e037b85a-1abe-41da-a113-59129451f35f" (UID: "e037b85a-1abe-41da-a113-59129451f35f"). InnerVolumeSpecName "kube-api-access-dbx52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.474445    4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e037b85a-1abe-41da-a113-59129451f35f" (UID: "e037b85a-1abe-41da-a113-59129451f35f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.529525    4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e037b85a-1abe-41da-a113-59129451f35f" (UID: "e037b85a-1abe-41da-a113-59129451f35f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.540995    4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-config-data" (OuterVolumeSpecName: "config-data") pod "e037b85a-1abe-41da-a113-59129451f35f" (UID: "e037b85a-1abe-41da-a113-59129451f35f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.547262    4873 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.547286    4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.547297    4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbx52\" (UniqueName: \"kubernetes.io/projected/e037b85a-1abe-41da-a113-59129451f35f-kube-api-access-dbx52\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.547306    4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:28 crc kubenswrapper[4873]: I0219 10:04:28.547314    4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e037b85a-1abe-41da-a113-59129451f35f-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.163604    4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e037b85a-1abe-41da-a113-59129451f35f","Type":"ContainerDied","Data":"14e5ec3f770f34a8b125fd308c847df1b6b0992739a9c9f49371a6a91090b5cf"}
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.163652    4873 scope.go:117] "RemoveContainer" containerID="e34e712b4e7100f172666d402654e7bfd335c81523cf9222b376f6ba0a3e3352"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.163772    4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.195792    4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.198827    4873 scope.go:117] "RemoveContainer" containerID="6cc5365f6277dce3c3167a3985183e662248a1d4d6e0a45883e0d5e65cacc65f"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.211225    4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.221906    4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 10:04:29 crc kubenswrapper[4873]: E0219 10:04:29.222482    4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="ceilometer-central-agent"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.222500    4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="ceilometer-central-agent"
Feb 19 10:04:29 crc kubenswrapper[4873]: E0219 10:04:29.222515    4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="sg-core"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.222522    4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="sg-core"
Feb 19 10:04:29 crc kubenswrapper[4873]: E0219 10:04:29.222546    4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="ceilometer-notification-agent"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.222554    4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="ceilometer-notification-agent"
Feb 19 10:04:29 crc kubenswrapper[4873]: E0219 10:04:29.222573    4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="proxy-httpd"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.222582    4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="proxy-httpd"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.222822    4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="ceilometer-central-agent"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.222846    4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="proxy-httpd"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.222858    4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="ceilometer-notification-agent"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.222868    4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="e037b85a-1abe-41da-a113-59129451f35f" containerName="sg-core"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.224756    4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.227451    4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.227584    4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.233511    4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.266262    4873 scope.go:117] "RemoveContainer" containerID="3c42726335cd97f7c97751f39a53a6a1c3ede5f0168be405b1c1c8f5ccc72a65"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.286751    4873 scope.go:117] "RemoveContainer" containerID="0a629e4ae5a7dbd420903dd59abab426b5d7e8df23fe2344c827772936600ccb"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.360457    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-config-data\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.360709    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.360829    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.360930    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8fsk\" (UniqueName: \"kubernetes.io/projected/aaaed141-d989-4b6f-ad3b-aefe0952c823-kube-api-access-t8fsk\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.361010    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaaed141-d989-4b6f-ad3b-aefe0952c823-log-httpd\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.361147    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaaed141-d989-4b6f-ad3b-aefe0952c823-run-httpd\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.361246    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-scripts\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.463129    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-config-data\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.463420    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.463553    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.463741    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8fsk\" (UniqueName: \"kubernetes.io/projected/aaaed141-d989-4b6f-ad3b-aefe0952c823-kube-api-access-t8fsk\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.463824    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaaed141-d989-4b6f-ad3b-aefe0952c823-log-httpd\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.463945    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaaed141-d989-4b6f-ad3b-aefe0952c823-run-httpd\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.464029    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-scripts\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.464474    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaaed141-d989-4b6f-ad3b-aefe0952c823-log-httpd\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.464543    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaaed141-d989-4b6f-ad3b-aefe0952c823-run-httpd\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.467815    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-config-data\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.468094    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-scripts\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.469224    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.470645    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.478458    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8fsk\" (UniqueName: \"kubernetes.io/projected/aaaed141-d989-4b6f-ad3b-aefe0952c823-kube-api-access-t8fsk\") pod \"ceilometer-0\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " pod="openstack/ceilometer-0"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.497342    4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e037b85a-1abe-41da-a113-59129451f35f" path="/var/lib/kubelet/pods/e037b85a-1abe-41da-a113-59129451f35f/volumes"
Feb 19 10:04:29 crc kubenswrapper[4873]: I0219 10:04:29.566873    4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 10:04:30 crc kubenswrapper[4873]: I0219 10:04:30.012202    4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 10:04:30 crc kubenswrapper[4873]: I0219 10:04:30.018667    4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 10:04:30 crc kubenswrapper[4873]: I0219 10:04:30.174300    4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaaed141-d989-4b6f-ad3b-aefe0952c823","Type":"ContainerStarted","Data":"a10f8c50de08f100cd9a2d7823fcccf70bf26fcbe7aced261a97c5fa92b15f9b"}
Feb 19 10:04:31 crc kubenswrapper[4873]: I0219 10:04:31.186246    4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaaed141-d989-4b6f-ad3b-aefe0952c823","Type":"ContainerStarted","Data":"090a2539fe4dfb43105b70b0c0088e33de18f21996d9aa508978b202715db0e2"}
Feb 19 10:04:31 crc kubenswrapper[4873]: I0219 10:04:31.186747    4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaaed141-d989-4b6f-ad3b-aefe0952c823","Type":"ContainerStarted","Data":"6a6cd3259862f8346cd815b704d4c845d978984e7e26d6d40916439da8e278e3"}
Feb 19 10:04:31 crc kubenswrapper[4873]: I0219 10:04:31.186758    4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaaed141-d989-4b6f-ad3b-aefe0952c823","Type":"ContainerStarted","Data":"927d24780f007579d39d1cb2db522f7319926e96deda19a1ce5ec381489e91e9"}
Feb 19 10:04:33 crc kubenswrapper[4873]: I0219 10:04:33.207047    4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaaed141-d989-4b6f-ad3b-aefe0952c823","Type":"ContainerStarted","Data":"08fd273370ab3603598065600f3bad166543e52db0dec0c4a5d168de0ad4347a"}
Feb 19 10:04:33 crc kubenswrapper[4873]: I0219 10:04:33.207626    4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 10:04:33 crc kubenswrapper[4873]: I0219 10:04:33.246755    4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.610104585 podStartE2EDuration="4.246734772s" podCreationTimestamp="2026-02-19 10:04:29 +0000 UTC" firstStartedPulling="2026-02-19 10:04:30.01845989 +0000 UTC m=+1179.307891528" lastFinishedPulling="2026-02-19 10:04:32.655090077 +0000 UTC m=+1181.944521715" observedRunningTime="2026-02-19 10:04:33.241741177 +0000 UTC m=+1182.531172815" watchObservedRunningTime="2026-02-19 10:04:33.246734772 +0000 UTC m=+1182.536166410"
Feb 19 10:04:34 crc kubenswrapper[4873]: I0219 10:04:34.603630    4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.093440    4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xzlnz"]
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.095001    4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.104653    4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.104885    4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.108744    4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xzlnz"]
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.175201    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-scripts\") pod \"nova-cell0-cell-mapping-xzlnz\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") " pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.175445    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6k84\" (UniqueName: \"kubernetes.io/projected/54b81c17-9130-4def-8021-e73168601bf6-kube-api-access-r6k84\") pod \"nova-cell0-cell-mapping-xzlnz\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") " pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.175595    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-config-data\") pod \"nova-cell0-cell-mapping-xzlnz\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") " pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.175723    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xzlnz\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") " pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.277563    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-scripts\") pod \"nova-cell0-cell-mapping-xzlnz\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") " pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.277899    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6k84\" (UniqueName: \"kubernetes.io/projected/54b81c17-9130-4def-8021-e73168601bf6-kube-api-access-r6k84\") pod \"nova-cell0-cell-mapping-xzlnz\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") " pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.277968    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-config-data\") pod \"nova-cell0-cell-mapping-xzlnz\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") " pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.277995    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xzlnz\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") " pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.293435    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-scripts\") pod \"nova-cell0-cell-mapping-xzlnz\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") " pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.293492    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-config-data\") pod \"nova-cell0-cell-mapping-xzlnz\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") " pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.293895    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xzlnz\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") " pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.303981    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6k84\" (UniqueName: \"kubernetes.io/projected/54b81c17-9130-4def-8021-e73168601bf6-kube-api-access-r6k84\") pod \"nova-cell0-cell-mapping-xzlnz\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") " pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.329165    4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.330868    4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.335027    4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.367502    4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.382572    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a513081d-764b-47d8-85d8-7019d7ea92ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " pod="openstack/nova-api-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.382630    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a513081d-764b-47d8-85d8-7019d7ea92ca-config-data\") pod \"nova-api-0\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " pod="openstack/nova-api-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.382662    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a513081d-764b-47d8-85d8-7019d7ea92ca-logs\") pod \"nova-api-0\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " pod="openstack/nova-api-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.382717    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnrln\" (UniqueName: \"kubernetes.io/projected/a513081d-764b-47d8-85d8-7019d7ea92ca-kube-api-access-gnrln\") pod \"nova-api-0\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " pod="openstack/nova-api-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.388376    4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.394387    4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.400533    4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.416584    4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.444661    4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.499594    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e8439c-3afb-4cde-b758-58871323cb9d-config-data\") pod \"nova-metadata-0\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " pod="openstack/nova-metadata-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.499656    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53e8439c-3afb-4cde-b758-58871323cb9d-logs\") pod \"nova-metadata-0\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " pod="openstack/nova-metadata-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.499681    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e8439c-3afb-4cde-b758-58871323cb9d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " pod="openstack/nova-metadata-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.499734    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a513081d-764b-47d8-85d8-7019d7ea92ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " pod="openstack/nova-api-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.499770    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a513081d-764b-47d8-85d8-7019d7ea92ca-config-data\") pod \"nova-api-0\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " pod="openstack/nova-api-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.499788    4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqrrj\" (UniqueName: \"kubernetes.io/projected/53e8439c-3afb-4cde-b758-58871323cb9d-kube-api-access-mqrrj\") pod \"nova-metadata-0\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " pod="openstack/nova-metadata-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.499816    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a513081d-764b-47d8-85d8-7019d7ea92ca-logs\") pod \"nova-api-0\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " pod="openstack/nova-api-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.499860    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnrln\" (UniqueName: \"kubernetes.io/projected/a513081d-764b-47d8-85d8-7019d7ea92ca-kube-api-access-gnrln\") pod \"nova-api-0\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " pod="openstack/nova-api-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.504554    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a513081d-764b-47d8-85d8-7019d7ea92ca-logs\") pod \"nova-api-0\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " pod="openstack/nova-api-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.507449    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a513081d-764b-47d8-85d8-7019d7ea92ca-config-data\") pod \"nova-api-0\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " pod="openstack/nova-api-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.525588    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a513081d-764b-47d8-85d8-7019d7ea92ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " pod="openstack/nova-api-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.533238    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnrln\" (UniqueName: \"kubernetes.io/projected/a513081d-764b-47d8-85d8-7019d7ea92ca-kube-api-access-gnrln\") pod \"nova-api-0\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " pod="openstack/nova-api-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.561405    4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.562848    4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.591781    4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.592148    4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.672118    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e8439c-3afb-4cde-b758-58871323cb9d-config-data\") pod \"nova-metadata-0\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " pod="openstack/nova-metadata-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.672237    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53e8439c-3afb-4cde-b758-58871323cb9d-logs\") pod \"nova-metadata-0\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " pod="openstack/nova-metadata-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.672303    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e8439c-3afb-4cde-b758-58871323cb9d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " pod="openstack/nova-metadata-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.672505    4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqrrj\" (UniqueName: \"kubernetes.io/projected/53e8439c-3afb-4cde-b758-58871323cb9d-kube-api-access-mqrrj\") pod \"nova-metadata-0\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " pod="openstack/nova-metadata-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.674650    4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.676708    4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.695513    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53e8439c-3afb-4cde-b758-58871323cb9d-logs\") pod \"nova-metadata-0\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " pod="openstack/nova-metadata-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.697359    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e8439c-3afb-4cde-b758-58871323cb9d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " pod="openstack/nova-metadata-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.697620    4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.698899    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e8439c-3afb-4cde-b758-58871323cb9d-config-data\") pod \"nova-metadata-0\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " pod="openstack/nova-metadata-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.703161    4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59574c798f-md9g4"]
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.785428    4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqrrj\" (UniqueName: \"kubernetes.io/projected/53e8439c-3afb-4cde-b758-58871323cb9d-kube-api-access-mqrrj\") pod \"nova-metadata-0\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " pod="openstack/nova-metadata-0"
Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.785553    4873 util.go:30] "No sandbox
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.787822 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.802543 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnrvh\" (UniqueName: \"kubernetes.io/projected/caad2a71-260f-41e5-99f0-532d73995f41-kube-api-access-jnrvh\") pod \"nova-scheduler-0\" (UID: \"caad2a71-260f-41e5-99f0-532d73995f41\") " pod="openstack/nova-scheduler-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.802786 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcmjb\" (UniqueName: \"kubernetes.io/projected/592b92b0-44a1-4386-8f2e-8a55633dedd8-kube-api-access-fcmjb\") pod \"nova-cell1-novncproxy-0\" (UID: \"592b92b0-44a1-4386-8f2e-8a55633dedd8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.802826 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-ovsdbserver-nb\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.802847 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-dns-swift-storage-0\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.802862 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caad2a71-260f-41e5-99f0-532d73995f41-config-data\") pod \"nova-scheduler-0\" (UID: \"caad2a71-260f-41e5-99f0-532d73995f41\") " pod="openstack/nova-scheduler-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.802885 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-dns-svc\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.802904 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-ovsdbserver-sb\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.802949 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/592b92b0-44a1-4386-8f2e-8a55633dedd8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"592b92b0-44a1-4386-8f2e-8a55633dedd8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.802968 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-config\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.802993 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmmp6\" (UniqueName: \"kubernetes.io/projected/561650f5-0705-4bab-903d-66bba11301ce-kube-api-access-wmmp6\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.803031 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592b92b0-44a1-4386-8f2e-8a55633dedd8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"592b92b0-44a1-4386-8f2e-8a55633dedd8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.803046 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caad2a71-260f-41e5-99f0-532d73995f41-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"caad2a71-260f-41e5-99f0-532d73995f41\") " pod="openstack/nova-scheduler-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.815685 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.863214 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59574c798f-md9g4"] Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.903967 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-ovsdbserver-nb\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.904001 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-dns-swift-storage-0\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.904018 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caad2a71-260f-41e5-99f0-532d73995f41-config-data\") pod \"nova-scheduler-0\" (UID: \"caad2a71-260f-41e5-99f0-532d73995f41\") " pod="openstack/nova-scheduler-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.904040 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-dns-svc\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.904060 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-ovsdbserver-sb\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.904078 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/592b92b0-44a1-4386-8f2e-8a55633dedd8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"592b92b0-44a1-4386-8f2e-8a55633dedd8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.909936 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-config\") pod \"dnsmasq-dns-59574c798f-md9g4\" 
(UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.910018 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmmp6\" (UniqueName: \"kubernetes.io/projected/561650f5-0705-4bab-903d-66bba11301ce-kube-api-access-wmmp6\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.910122 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caad2a71-260f-41e5-99f0-532d73995f41-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"caad2a71-260f-41e5-99f0-532d73995f41\") " pod="openstack/nova-scheduler-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.910143 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592b92b0-44a1-4386-8f2e-8a55633dedd8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"592b92b0-44a1-4386-8f2e-8a55633dedd8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.910258 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnrvh\" (UniqueName: \"kubernetes.io/projected/caad2a71-260f-41e5-99f0-532d73995f41-kube-api-access-jnrvh\") pod \"nova-scheduler-0\" (UID: \"caad2a71-260f-41e5-99f0-532d73995f41\") " pod="openstack/nova-scheduler-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.910326 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcmjb\" (UniqueName: \"kubernetes.io/projected/592b92b0-44a1-4386-8f2e-8a55633dedd8-kube-api-access-fcmjb\") pod \"nova-cell1-novncproxy-0\" (UID: \"592b92b0-44a1-4386-8f2e-8a55633dedd8\") " 
pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.911521 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-dns-svc\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.920166 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-ovsdbserver-sb\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.940996 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-dns-swift-storage-0\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.941448 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-config\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.941525 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-ovsdbserver-nb\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.945400 4873 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caad2a71-260f-41e5-99f0-532d73995f41-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"caad2a71-260f-41e5-99f0-532d73995f41\") " pod="openstack/nova-scheduler-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.946298 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caad2a71-260f-41e5-99f0-532d73995f41-config-data\") pod \"nova-scheduler-0\" (UID: \"caad2a71-260f-41e5-99f0-532d73995f41\") " pod="openstack/nova-scheduler-0" Feb 19 10:04:35 crc kubenswrapper[4873]: I0219 10:04:35.959345 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.083459 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcmjb\" (UniqueName: \"kubernetes.io/projected/592b92b0-44a1-4386-8f2e-8a55633dedd8-kube-api-access-fcmjb\") pod \"nova-cell1-novncproxy-0\" (UID: \"592b92b0-44a1-4386-8f2e-8a55633dedd8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.083962 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmmp6\" (UniqueName: \"kubernetes.io/projected/561650f5-0705-4bab-903d-66bba11301ce-kube-api-access-wmmp6\") pod \"dnsmasq-dns-59574c798f-md9g4\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.090603 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592b92b0-44a1-4386-8f2e-8a55633dedd8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"592b92b0-44a1-4386-8f2e-8a55633dedd8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:04:36 crc kubenswrapper[4873]: 
I0219 10:04:36.095249 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/592b92b0-44a1-4386-8f2e-8a55633dedd8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"592b92b0-44a1-4386-8f2e-8a55633dedd8\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.095574 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnrvh\" (UniqueName: \"kubernetes.io/projected/caad2a71-260f-41e5-99f0-532d73995f41-kube-api-access-jnrvh\") pod \"nova-scheduler-0\" (UID: \"caad2a71-260f-41e5-99f0-532d73995f41\") " pod="openstack/nova-scheduler-0" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.102734 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.181526 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.284600 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.364715 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xzlnz"] Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.717224 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.805734 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9z5nq"] Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.807432 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.813572 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.813839 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.817346 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9z5nq"] Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.846336 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9z5nq\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.846437 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d48kf\" (UniqueName: \"kubernetes.io/projected/96fca831-509a-4abd-bb7e-2c0f4704368b-kube-api-access-d48kf\") pod \"nova-cell1-conductor-db-sync-9z5nq\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.846558 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-config-data\") pod \"nova-cell1-conductor-db-sync-9z5nq\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.846611 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-scripts\") pod \"nova-cell1-conductor-db-sync-9z5nq\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.903590 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.935364 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.948265 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d48kf\" (UniqueName: \"kubernetes.io/projected/96fca831-509a-4abd-bb7e-2c0f4704368b-kube-api-access-d48kf\") pod \"nova-cell1-conductor-db-sync-9z5nq\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.948366 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-config-data\") pod \"nova-cell1-conductor-db-sync-9z5nq\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.948405 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-scripts\") pod \"nova-cell1-conductor-db-sync-9z5nq\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.948461 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9z5nq\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.970095 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-config-data\") pod \"nova-cell1-conductor-db-sync-9z5nq\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.972201 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-scripts\") pod \"nova-cell1-conductor-db-sync-9z5nq\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.972849 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9z5nq\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:36 crc kubenswrapper[4873]: I0219 10:04:36.974463 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d48kf\" (UniqueName: \"kubernetes.io/projected/96fca831-509a-4abd-bb7e-2c0f4704368b-kube-api-access-d48kf\") pod \"nova-cell1-conductor-db-sync-9z5nq\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:37 crc kubenswrapper[4873]: I0219 10:04:37.064210 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 10:04:37 crc kubenswrapper[4873]: I0219 
10:04:37.094761 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59574c798f-md9g4"] Feb 19 10:04:37 crc kubenswrapper[4873]: I0219 10:04:37.155283 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:37 crc kubenswrapper[4873]: I0219 10:04:37.291993 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53e8439c-3afb-4cde-b758-58871323cb9d","Type":"ContainerStarted","Data":"cd7174e87fb58baa828b8b7cbbe39dc5f4224bb9b672908e6aee18a4d7cb73c7"} Feb 19 10:04:37 crc kubenswrapper[4873]: I0219 10:04:37.297611 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a513081d-764b-47d8-85d8-7019d7ea92ca","Type":"ContainerStarted","Data":"d76a4cd46cc684d689557d9372222e68498dc6bce68b7bb1dbfe2b38a2ff9d6a"} Feb 19 10:04:37 crc kubenswrapper[4873]: I0219 10:04:37.306506 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xzlnz" event={"ID":"54b81c17-9130-4def-8021-e73168601bf6","Type":"ContainerStarted","Data":"c163e1fbae8dfb18e81d4177d941e04ca8d149e8d88a196ee094871f3dd31d8c"} Feb 19 10:04:37 crc kubenswrapper[4873]: I0219 10:04:37.306549 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xzlnz" event={"ID":"54b81c17-9130-4def-8021-e73168601bf6","Type":"ContainerStarted","Data":"68ee08e86d5531761dadfdc7d5bd0ceb0e4ff127a8a81f8599f657d66df48dcd"} Feb 19 10:04:37 crc kubenswrapper[4873]: I0219 10:04:37.319763 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"caad2a71-260f-41e5-99f0-532d73995f41","Type":"ContainerStarted","Data":"6286e92557ca766dfcf83501ac2e45ef3faaab8e9e1c247dd572c05c5d0518c2"} Feb 19 10:04:37 crc kubenswrapper[4873]: I0219 10:04:37.332552 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-cell-mapping-xzlnz" podStartSLOduration=2.332383255 podStartE2EDuration="2.332383255s" podCreationTimestamp="2026-02-19 10:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:04:37.327439991 +0000 UTC m=+1186.616871629" watchObservedRunningTime="2026-02-19 10:04:37.332383255 +0000 UTC m=+1186.621814893" Feb 19 10:04:37 crc kubenswrapper[4873]: I0219 10:04:37.342815 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"592b92b0-44a1-4386-8f2e-8a55633dedd8","Type":"ContainerStarted","Data":"dd5cbf40dbe04c7e636d3d6dd6dd491656051f5351076f90a0afd9cb6afdd2b3"} Feb 19 10:04:37 crc kubenswrapper[4873]: I0219 10:04:37.353163 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59574c798f-md9g4" event={"ID":"561650f5-0705-4bab-903d-66bba11301ce","Type":"ContainerStarted","Data":"36949a05a228a205fbf13f7609b5591909a30f0981a5fef6bf17ca7a531f1283"} Feb 19 10:04:37 crc kubenswrapper[4873]: I0219 10:04:37.842550 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9z5nq"] Feb 19 10:04:38 crc kubenswrapper[4873]: I0219 10:04:38.389203 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9z5nq" event={"ID":"96fca831-509a-4abd-bb7e-2c0f4704368b","Type":"ContainerStarted","Data":"fa528c871922eeb11614e9dc0af7c2459d78bffa25b2a2a86e1fa5e00eb6941c"} Feb 19 10:04:38 crc kubenswrapper[4873]: I0219 10:04:38.417324 4873 generic.go:334] "Generic (PLEG): container finished" podID="561650f5-0705-4bab-903d-66bba11301ce" containerID="0f9b9b7ee4ce408ec602f5938ab243185a9595459642fa3aa668d6fc66f1980f" exitCode=0 Feb 19 10:04:38 crc kubenswrapper[4873]: I0219 10:04:38.419583 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59574c798f-md9g4" 
event={"ID":"561650f5-0705-4bab-903d-66bba11301ce","Type":"ContainerDied","Data":"0f9b9b7ee4ce408ec602f5938ab243185a9595459642fa3aa668d6fc66f1980f"} Feb 19 10:04:39 crc kubenswrapper[4873]: I0219 10:04:39.120248 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 10:04:39 crc kubenswrapper[4873]: I0219 10:04:39.132507 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 10:04:39 crc kubenswrapper[4873]: I0219 10:04:39.427371 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9z5nq" event={"ID":"96fca831-509a-4abd-bb7e-2c0f4704368b","Type":"ContainerStarted","Data":"343f5f5d97db66db1963a29b53ef93078842c1069756343de5aa869eb8885cd9"} Feb 19 10:04:39 crc kubenswrapper[4873]: I0219 10:04:39.447064 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-9z5nq" podStartSLOduration=3.447042819 podStartE2EDuration="3.447042819s" podCreationTimestamp="2026-02-19 10:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:04:39.442869454 +0000 UTC m=+1188.732301092" watchObservedRunningTime="2026-02-19 10:04:39.447042819 +0000 UTC m=+1188.736474457" Feb 19 10:04:40 crc kubenswrapper[4873]: I0219 10:04:40.438513 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59574c798f-md9g4" event={"ID":"561650f5-0705-4bab-903d-66bba11301ce","Type":"ContainerStarted","Data":"5fbace2c800a52592460c743049ca978bd11128cb3460f233a4ff016b5877548"} Feb 19 10:04:40 crc kubenswrapper[4873]: I0219 10:04:40.439009 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:40 crc kubenswrapper[4873]: I0219 10:04:40.467973 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-59574c798f-md9g4" podStartSLOduration=5.467952669 podStartE2EDuration="5.467952669s" podCreationTimestamp="2026-02-19 10:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:04:40.460034411 +0000 UTC m=+1189.749466069" watchObservedRunningTime="2026-02-19 10:04:40.467952669 +0000 UTC m=+1189.757384307" Feb 19 10:04:42 crc kubenswrapper[4873]: I0219 10:04:42.462670 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"592b92b0-44a1-4386-8f2e-8a55633dedd8","Type":"ContainerStarted","Data":"d989940ed8c94822b97ee63d283f9c02be1cea0db1e5579461b45d4f18b376b3"} Feb 19 10:04:42 crc kubenswrapper[4873]: I0219 10:04:42.463024 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="592b92b0-44a1-4386-8f2e-8a55633dedd8" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d989940ed8c94822b97ee63d283f9c02be1cea0db1e5579461b45d4f18b376b3" gracePeriod=30 Feb 19 10:04:42 crc kubenswrapper[4873]: I0219 10:04:42.466329 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53e8439c-3afb-4cde-b758-58871323cb9d","Type":"ContainerStarted","Data":"e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92"} Feb 19 10:04:42 crc kubenswrapper[4873]: I0219 10:04:42.466369 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53e8439c-3afb-4cde-b758-58871323cb9d","Type":"ContainerStarted","Data":"43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3"} Feb 19 10:04:42 crc kubenswrapper[4873]: I0219 10:04:42.466476 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="53e8439c-3afb-4cde-b758-58871323cb9d" containerName="nova-metadata-log" 
containerID="cri-o://43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3" gracePeriod=30 Feb 19 10:04:42 crc kubenswrapper[4873]: I0219 10:04:42.466570 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="53e8439c-3afb-4cde-b758-58871323cb9d" containerName="nova-metadata-metadata" containerID="cri-o://e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92" gracePeriod=30 Feb 19 10:04:42 crc kubenswrapper[4873]: I0219 10:04:42.470092 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a513081d-764b-47d8-85d8-7019d7ea92ca","Type":"ContainerStarted","Data":"b1b21af5cb273a16647acb4511e7255326ce000c69525f549d69868584813f49"} Feb 19 10:04:42 crc kubenswrapper[4873]: I0219 10:04:42.470234 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a513081d-764b-47d8-85d8-7019d7ea92ca","Type":"ContainerStarted","Data":"bed6311acd16b8356a4487f32ee541e984f5932f9c965fe2c30e30a6b4581fde"} Feb 19 10:04:42 crc kubenswrapper[4873]: I0219 10:04:42.475657 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"caad2a71-260f-41e5-99f0-532d73995f41","Type":"ContainerStarted","Data":"c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225"} Feb 19 10:04:42 crc kubenswrapper[4873]: I0219 10:04:42.488985 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.283272375 podStartE2EDuration="7.488970111s" podCreationTimestamp="2026-02-19 10:04:35 +0000 UTC" firstStartedPulling="2026-02-19 10:04:37.078636979 +0000 UTC m=+1186.368068617" lastFinishedPulling="2026-02-19 10:04:41.284334715 +0000 UTC m=+1190.573766353" observedRunningTime="2026-02-19 10:04:42.488627813 +0000 UTC m=+1191.778059451" watchObservedRunningTime="2026-02-19 10:04:42.488970111 +0000 UTC m=+1191.778401749" Feb 19 10:04:42 crc 
kubenswrapper[4873]: I0219 10:04:42.509195 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.957040656 podStartE2EDuration="7.509179306s" podCreationTimestamp="2026-02-19 10:04:35 +0000 UTC" firstStartedPulling="2026-02-19 10:04:36.72677169 +0000 UTC m=+1186.016203328" lastFinishedPulling="2026-02-19 10:04:41.27891032 +0000 UTC m=+1190.568341978" observedRunningTime="2026-02-19 10:04:42.5065378 +0000 UTC m=+1191.795969438" watchObservedRunningTime="2026-02-19 10:04:42.509179306 +0000 UTC m=+1191.798610944" Feb 19 10:04:42 crc kubenswrapper[4873]: I0219 10:04:42.533336 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.174162597 podStartE2EDuration="7.533290499s" podCreationTimestamp="2026-02-19 10:04:35 +0000 UTC" firstStartedPulling="2026-02-19 10:04:36.922933046 +0000 UTC m=+1186.212364684" lastFinishedPulling="2026-02-19 10:04:41.282060948 +0000 UTC m=+1190.571492586" observedRunningTime="2026-02-19 10:04:42.531342271 +0000 UTC m=+1191.820773909" watchObservedRunningTime="2026-02-19 10:04:42.533290499 +0000 UTC m=+1191.822722127" Feb 19 10:04:42 crc kubenswrapper[4873]: I0219 10:04:42.562487 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.215551942 podStartE2EDuration="7.562470309s" podCreationTimestamp="2026-02-19 10:04:35 +0000 UTC" firstStartedPulling="2026-02-19 10:04:36.931689025 +0000 UTC m=+1186.221120663" lastFinishedPulling="2026-02-19 10:04:41.278607392 +0000 UTC m=+1190.568039030" observedRunningTime="2026-02-19 10:04:42.553081984 +0000 UTC m=+1191.842513622" watchObservedRunningTime="2026-02-19 10:04:42.562470309 +0000 UTC m=+1191.851901947" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.113869 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.212753 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqrrj\" (UniqueName: \"kubernetes.io/projected/53e8439c-3afb-4cde-b758-58871323cb9d-kube-api-access-mqrrj\") pod \"53e8439c-3afb-4cde-b758-58871323cb9d\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.212894 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e8439c-3afb-4cde-b758-58871323cb9d-combined-ca-bundle\") pod \"53e8439c-3afb-4cde-b758-58871323cb9d\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.212921 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e8439c-3afb-4cde-b758-58871323cb9d-config-data\") pod \"53e8439c-3afb-4cde-b758-58871323cb9d\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.213003 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53e8439c-3afb-4cde-b758-58871323cb9d-logs\") pod \"53e8439c-3afb-4cde-b758-58871323cb9d\" (UID: \"53e8439c-3afb-4cde-b758-58871323cb9d\") " Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.213800 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53e8439c-3afb-4cde-b758-58871323cb9d-logs" (OuterVolumeSpecName: "logs") pod "53e8439c-3afb-4cde-b758-58871323cb9d" (UID: "53e8439c-3afb-4cde-b758-58871323cb9d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.219794 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e8439c-3afb-4cde-b758-58871323cb9d-kube-api-access-mqrrj" (OuterVolumeSpecName: "kube-api-access-mqrrj") pod "53e8439c-3afb-4cde-b758-58871323cb9d" (UID: "53e8439c-3afb-4cde-b758-58871323cb9d"). InnerVolumeSpecName "kube-api-access-mqrrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.245005 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53e8439c-3afb-4cde-b758-58871323cb9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53e8439c-3afb-4cde-b758-58871323cb9d" (UID: "53e8439c-3afb-4cde-b758-58871323cb9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.248637 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53e8439c-3afb-4cde-b758-58871323cb9d-config-data" (OuterVolumeSpecName: "config-data") pod "53e8439c-3afb-4cde-b758-58871323cb9d" (UID: "53e8439c-3afb-4cde-b758-58871323cb9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.315349 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53e8439c-3afb-4cde-b758-58871323cb9d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.315385 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53e8439c-3afb-4cde-b758-58871323cb9d-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.315394 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqrrj\" (UniqueName: \"kubernetes.io/projected/53e8439c-3afb-4cde-b758-58871323cb9d-kube-api-access-mqrrj\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.315404 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e8439c-3afb-4cde-b758-58871323cb9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.486782 4873 generic.go:334] "Generic (PLEG): container finished" podID="53e8439c-3afb-4cde-b758-58871323cb9d" containerID="e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92" exitCode=0 Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.487070 4873 generic.go:334] "Generic (PLEG): container finished" podID="53e8439c-3afb-4cde-b758-58871323cb9d" containerID="43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3" exitCode=143 Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.487664 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.504393 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53e8439c-3afb-4cde-b758-58871323cb9d","Type":"ContainerDied","Data":"e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92"} Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.504440 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53e8439c-3afb-4cde-b758-58871323cb9d","Type":"ContainerDied","Data":"43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3"} Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.504496 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53e8439c-3afb-4cde-b758-58871323cb9d","Type":"ContainerDied","Data":"cd7174e87fb58baa828b8b7cbbe39dc5f4224bb9b672908e6aee18a4d7cb73c7"} Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.504519 4873 scope.go:117] "RemoveContainer" containerID="e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.547176 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.570343 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.570945 4873 scope.go:117] "RemoveContainer" containerID="43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.581204 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 10:04:43 crc kubenswrapper[4873]: E0219 10:04:43.581600 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e8439c-3afb-4cde-b758-58871323cb9d" containerName="nova-metadata-log" Feb 19 10:04:43 crc 
kubenswrapper[4873]: I0219 10:04:43.581638 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e8439c-3afb-4cde-b758-58871323cb9d" containerName="nova-metadata-log" Feb 19 10:04:43 crc kubenswrapper[4873]: E0219 10:04:43.581656 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e8439c-3afb-4cde-b758-58871323cb9d" containerName="nova-metadata-metadata" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.581662 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e8439c-3afb-4cde-b758-58871323cb9d" containerName="nova-metadata-metadata" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.581839 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="53e8439c-3afb-4cde-b758-58871323cb9d" containerName="nova-metadata-log" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.581855 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="53e8439c-3afb-4cde-b758-58871323cb9d" containerName="nova-metadata-metadata" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.582798 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.594493 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.594811 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.631078 4873 scope.go:117] "RemoveContainer" containerID="e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92" Feb 19 10:04:43 crc kubenswrapper[4873]: E0219 10:04:43.631890 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92\": container with ID starting with e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92 not found: ID does not exist" containerID="e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.632159 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92"} err="failed to get container status \"e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92\": rpc error: code = NotFound desc = could not find container \"e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92\": container with ID starting with e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92 not found: ID does not exist" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.632196 4873 scope.go:117] "RemoveContainer" containerID="43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3" Feb 19 10:04:43 crc kubenswrapper[4873]: E0219 10:04:43.633122 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3\": container with ID starting with 43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3 not found: ID does not exist" containerID="43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.633177 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3"} err="failed to get container status \"43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3\": rpc error: code = NotFound desc = could not find container \"43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3\": container with ID starting with 43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3 not found: ID does not exist" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.633211 4873 scope.go:117] "RemoveContainer" containerID="e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.633585 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92"} err="failed to get container status \"e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92\": rpc error: code = NotFound desc = could not find container \"e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92\": container with ID starting with e47c982444f3b7e8df76455de59692aef41c0f5dc5a6d65d28ae042a8818af92 not found: ID does not exist" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.633630 4873 scope.go:117] "RemoveContainer" containerID="43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.634034 4873 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3"} err="failed to get container status \"43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3\": rpc error: code = NotFound desc = could not find container \"43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3\": container with ID starting with 43a616daef0cb7b8596108c40bf81ff6c3694874c913665dcb3bf8b7746c76c3 not found: ID does not exist" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.643399 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.732559 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knkwd\" (UniqueName: \"kubernetes.io/projected/a77215ef-e683-4030-9b93-5f30814f1158-kube-api-access-knkwd\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.733149 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-config-data\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.733405 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.733477 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a77215ef-e683-4030-9b93-5f30814f1158-logs\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.733511 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.835359 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knkwd\" (UniqueName: \"kubernetes.io/projected/a77215ef-e683-4030-9b93-5f30814f1158-kube-api-access-knkwd\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.835741 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-config-data\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.835785 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.835814 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a77215ef-e683-4030-9b93-5f30814f1158-logs\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " 
pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.835836 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.836630 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a77215ef-e683-4030-9b93-5f30814f1158-logs\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.840572 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.840907 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.851731 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-config-data\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.853144 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knkwd\" (UniqueName: 
\"kubernetes.io/projected/a77215ef-e683-4030-9b93-5f30814f1158-kube-api-access-knkwd\") pod \"nova-metadata-0\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") " pod="openstack/nova-metadata-0" Feb 19 10:04:43 crc kubenswrapper[4873]: I0219 10:04:43.931724 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 10:04:44 crc kubenswrapper[4873]: I0219 10:04:44.423757 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 10:04:44 crc kubenswrapper[4873]: I0219 10:04:44.501962 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a77215ef-e683-4030-9b93-5f30814f1158","Type":"ContainerStarted","Data":"ecd5ec7a0c35523bee927dcc16abb03decd8d6779d5d402a7592d31b466ab8a8"} Feb 19 10:04:45 crc kubenswrapper[4873]: I0219 10:04:45.699952 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53e8439c-3afb-4cde-b758-58871323cb9d" path="/var/lib/kubelet/pods/53e8439c-3afb-4cde-b758-58871323cb9d/volumes" Feb 19 10:04:45 crc kubenswrapper[4873]: I0219 10:04:45.703640 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a77215ef-e683-4030-9b93-5f30814f1158","Type":"ContainerStarted","Data":"0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796"} Feb 19 10:04:45 crc kubenswrapper[4873]: I0219 10:04:45.703683 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a77215ef-e683-4030-9b93-5f30814f1158","Type":"ContainerStarted","Data":"db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3"} Feb 19 10:04:45 crc kubenswrapper[4873]: I0219 10:04:45.744909 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.744879645 podStartE2EDuration="2.744879645s" podCreationTimestamp="2026-02-19 10:04:43 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:04:45.735786607 +0000 UTC m=+1195.025218285" watchObservedRunningTime="2026-02-19 10:04:45.744879645 +0000 UTC m=+1195.034311293" Feb 19 10:04:45 crc kubenswrapper[4873]: I0219 10:04:45.786697 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 10:04:45 crc kubenswrapper[4873]: I0219 10:04:45.786763 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.104490 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.104537 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.159429 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.185084 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.253190 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8656fdbcc7-6lw5c"] Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.253661 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" podUID="e78542dc-01da-47dc-aec5-a380b7484425" containerName="dnsmasq-dns" containerID="cri-o://89333ead7926aa2618b4a528f8056c909774aca95003b189eba1f9e1ae277295" gracePeriod=10 Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.286027 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 
10:04:46.719541 4873 generic.go:334] "Generic (PLEG): container finished" podID="54b81c17-9130-4def-8021-e73168601bf6" containerID="c163e1fbae8dfb18e81d4177d941e04ca8d149e8d88a196ee094871f3dd31d8c" exitCode=0 Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.719619 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xzlnz" event={"ID":"54b81c17-9130-4def-8021-e73168601bf6","Type":"ContainerDied","Data":"c163e1fbae8dfb18e81d4177d941e04ca8d149e8d88a196ee094871f3dd31d8c"} Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.722400 4873 generic.go:334] "Generic (PLEG): container finished" podID="e78542dc-01da-47dc-aec5-a380b7484425" containerID="89333ead7926aa2618b4a528f8056c909774aca95003b189eba1f9e1ae277295" exitCode=0 Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.723298 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" event={"ID":"e78542dc-01da-47dc-aec5-a380b7484425","Type":"ContainerDied","Data":"89333ead7926aa2618b4a528f8056c909774aca95003b189eba1f9e1ae277295"} Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.767852 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.869557 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a513081d-764b-47d8-85d8-7019d7ea92ca" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.869613 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a513081d-764b-47d8-85d8-7019d7ea92ca" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 10:04:46 
crc kubenswrapper[4873]: I0219 10:04:46.882359 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.922402 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwhw4\" (UniqueName: \"kubernetes.io/projected/e78542dc-01da-47dc-aec5-a380b7484425-kube-api-access-fwhw4\") pod \"e78542dc-01da-47dc-aec5-a380b7484425\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.922436 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-ovsdbserver-sb\") pod \"e78542dc-01da-47dc-aec5-a380b7484425\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.922524 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-dns-svc\") pod \"e78542dc-01da-47dc-aec5-a380b7484425\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.922611 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-config\") pod \"e78542dc-01da-47dc-aec5-a380b7484425\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.922629 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-dns-swift-storage-0\") pod \"e78542dc-01da-47dc-aec5-a380b7484425\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 
10:04:46.922652 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-ovsdbserver-nb\") pod \"e78542dc-01da-47dc-aec5-a380b7484425\" (UID: \"e78542dc-01da-47dc-aec5-a380b7484425\") " Feb 19 10:04:46 crc kubenswrapper[4873]: I0219 10:04:46.941484 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e78542dc-01da-47dc-aec5-a380b7484425-kube-api-access-fwhw4" (OuterVolumeSpecName: "kube-api-access-fwhw4") pod "e78542dc-01da-47dc-aec5-a380b7484425" (UID: "e78542dc-01da-47dc-aec5-a380b7484425"). InnerVolumeSpecName "kube-api-access-fwhw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.002730 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e78542dc-01da-47dc-aec5-a380b7484425" (UID: "e78542dc-01da-47dc-aec5-a380b7484425"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.018648 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e78542dc-01da-47dc-aec5-a380b7484425" (UID: "e78542dc-01da-47dc-aec5-a380b7484425"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.021585 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e78542dc-01da-47dc-aec5-a380b7484425" (UID: "e78542dc-01da-47dc-aec5-a380b7484425"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.026765 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.026815 4873 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.026830 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.026840 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwhw4\" (UniqueName: \"kubernetes.io/projected/e78542dc-01da-47dc-aec5-a380b7484425-kube-api-access-fwhw4\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.053584 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e78542dc-01da-47dc-aec5-a380b7484425" (UID: "e78542dc-01da-47dc-aec5-a380b7484425"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.066522 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-config" (OuterVolumeSpecName: "config") pod "e78542dc-01da-47dc-aec5-a380b7484425" (UID: "e78542dc-01da-47dc-aec5-a380b7484425"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.129048 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-config\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.129091 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e78542dc-01da-47dc-aec5-a380b7484425-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.733817 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c" event={"ID":"e78542dc-01da-47dc-aec5-a380b7484425","Type":"ContainerDied","Data":"c7ffc8e18883ae90270b9d4c0dcb813698f920dfda3376430f83575ac81ce7b9"}
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.733872 4873 scope.go:117] "RemoveContainer" containerID="89333ead7926aa2618b4a528f8056c909774aca95003b189eba1f9e1ae277295"
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.734880 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8656fdbcc7-6lw5c"
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.768215 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8656fdbcc7-6lw5c"]
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.768397 4873 scope.go:117] "RemoveContainer" containerID="20fe864189fb33810eb3acc7dc0b89314091b0776fb2a2bfe18804bc13374185"
Feb 19 10:04:47 crc kubenswrapper[4873]: I0219 10:04:47.783905 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8656fdbcc7-6lw5c"]
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.231427 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.358389 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6k84\" (UniqueName: \"kubernetes.io/projected/54b81c17-9130-4def-8021-e73168601bf6-kube-api-access-r6k84\") pod \"54b81c17-9130-4def-8021-e73168601bf6\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") "
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.358531 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-combined-ca-bundle\") pod \"54b81c17-9130-4def-8021-e73168601bf6\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") "
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.358695 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-scripts\") pod \"54b81c17-9130-4def-8021-e73168601bf6\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") "
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.358759 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-config-data\") pod \"54b81c17-9130-4def-8021-e73168601bf6\" (UID: \"54b81c17-9130-4def-8021-e73168601bf6\") "
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.368353 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-scripts" (OuterVolumeSpecName: "scripts") pod "54b81c17-9130-4def-8021-e73168601bf6" (UID: "54b81c17-9130-4def-8021-e73168601bf6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.368472 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b81c17-9130-4def-8021-e73168601bf6-kube-api-access-r6k84" (OuterVolumeSpecName: "kube-api-access-r6k84") pod "54b81c17-9130-4def-8021-e73168601bf6" (UID: "54b81c17-9130-4def-8021-e73168601bf6"). InnerVolumeSpecName "kube-api-access-r6k84". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.391161 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54b81c17-9130-4def-8021-e73168601bf6" (UID: "54b81c17-9130-4def-8021-e73168601bf6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.397270 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-config-data" (OuterVolumeSpecName: "config-data") pod "54b81c17-9130-4def-8021-e73168601bf6" (UID: "54b81c17-9130-4def-8021-e73168601bf6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.461252 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.461308 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6k84\" (UniqueName: \"kubernetes.io/projected/54b81c17-9130-4def-8021-e73168601bf6-kube-api-access-r6k84\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.461323 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.461334 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b81c17-9130-4def-8021-e73168601bf6-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.746002 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xzlnz" event={"ID":"54b81c17-9130-4def-8021-e73168601bf6","Type":"ContainerDied","Data":"68ee08e86d5531761dadfdc7d5bd0ceb0e4ff127a8a81f8599f657d66df48dcd"}
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.746029 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xzlnz"
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.746045 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68ee08e86d5531761dadfdc7d5bd0ceb0e4ff127a8a81f8599f657d66df48dcd"
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.933048 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 10:04:48 crc kubenswrapper[4873]: I0219 10:04:48.933163 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 10:04:49 crc kubenswrapper[4873]: I0219 10:04:49.016366 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 10:04:49 crc kubenswrapper[4873]: I0219 10:04:49.016612 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a513081d-764b-47d8-85d8-7019d7ea92ca" containerName="nova-api-log" containerID="cri-o://bed6311acd16b8356a4487f32ee541e984f5932f9c965fe2c30e30a6b4581fde" gracePeriod=30
Feb 19 10:04:49 crc kubenswrapper[4873]: I0219 10:04:49.016716 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a513081d-764b-47d8-85d8-7019d7ea92ca" containerName="nova-api-api" containerID="cri-o://b1b21af5cb273a16647acb4511e7255326ce000c69525f549d69868584813f49" gracePeriod=30
Feb 19 10:04:49 crc kubenswrapper[4873]: I0219 10:04:49.031742 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 10:04:49 crc kubenswrapper[4873]: I0219 10:04:49.031979 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="caad2a71-260f-41e5-99f0-532d73995f41" containerName="nova-scheduler-scheduler" containerID="cri-o://c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225" gracePeriod=30
Feb 19 10:04:49 crc kubenswrapper[4873]: I0219 10:04:49.054529 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 10:04:49 crc kubenswrapper[4873]: I0219 10:04:49.496052 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e78542dc-01da-47dc-aec5-a380b7484425" path="/var/lib/kubelet/pods/e78542dc-01da-47dc-aec5-a380b7484425/volumes"
Feb 19 10:04:49 crc kubenswrapper[4873]: I0219 10:04:49.761375 4873 generic.go:334] "Generic (PLEG): container finished" podID="a513081d-764b-47d8-85d8-7019d7ea92ca" containerID="bed6311acd16b8356a4487f32ee541e984f5932f9c965fe2c30e30a6b4581fde" exitCode=143
Feb 19 10:04:49 crc kubenswrapper[4873]: I0219 10:04:49.762277 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a513081d-764b-47d8-85d8-7019d7ea92ca","Type":"ContainerDied","Data":"bed6311acd16b8356a4487f32ee541e984f5932f9c965fe2c30e30a6b4581fde"}
Feb 19 10:04:50 crc kubenswrapper[4873]: I0219 10:04:50.771488 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a77215ef-e683-4030-9b93-5f30814f1158" containerName="nova-metadata-metadata" containerID="cri-o://0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796" gracePeriod=30
Feb 19 10:04:50 crc kubenswrapper[4873]: I0219 10:04:50.771474 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a77215ef-e683-4030-9b93-5f30814f1158" containerName="nova-metadata-log" containerID="cri-o://db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3" gracePeriod=30
Feb 19 10:04:51 crc kubenswrapper[4873]: E0219 10:04:51.105195 4873 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225 is running failed: container process not found" containerID="c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 10:04:51 crc kubenswrapper[4873]: E0219 10:04:51.105581 4873 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225 is running failed: container process not found" containerID="c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 10:04:51 crc kubenswrapper[4873]: E0219 10:04:51.105941 4873 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225 is running failed: container process not found" containerID="c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 10:04:51 crc kubenswrapper[4873]: E0219 10:04:51.106009 4873 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="caad2a71-260f-41e5-99f0-532d73995f41" containerName="nova-scheduler-scheduler"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.446540 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.453962 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.625508 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnrvh\" (UniqueName: \"kubernetes.io/projected/caad2a71-260f-41e5-99f0-532d73995f41-kube-api-access-jnrvh\") pod \"caad2a71-260f-41e5-99f0-532d73995f41\" (UID: \"caad2a71-260f-41e5-99f0-532d73995f41\") "
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.625911 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caad2a71-260f-41e5-99f0-532d73995f41-config-data\") pod \"caad2a71-260f-41e5-99f0-532d73995f41\" (UID: \"caad2a71-260f-41e5-99f0-532d73995f41\") "
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.626075 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-combined-ca-bundle\") pod \"a77215ef-e683-4030-9b93-5f30814f1158\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") "
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.626216 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knkwd\" (UniqueName: \"kubernetes.io/projected/a77215ef-e683-4030-9b93-5f30814f1158-kube-api-access-knkwd\") pod \"a77215ef-e683-4030-9b93-5f30814f1158\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") "
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.626365 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caad2a71-260f-41e5-99f0-532d73995f41-combined-ca-bundle\") pod \"caad2a71-260f-41e5-99f0-532d73995f41\" (UID: \"caad2a71-260f-41e5-99f0-532d73995f41\") "
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.626530 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a77215ef-e683-4030-9b93-5f30814f1158-logs\") pod \"a77215ef-e683-4030-9b93-5f30814f1158\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") "
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.626646 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-nova-metadata-tls-certs\") pod \"a77215ef-e683-4030-9b93-5f30814f1158\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") "
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.626775 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-config-data\") pod \"a77215ef-e683-4030-9b93-5f30814f1158\" (UID: \"a77215ef-e683-4030-9b93-5f30814f1158\") "
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.627206 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a77215ef-e683-4030-9b93-5f30814f1158-logs" (OuterVolumeSpecName: "logs") pod "a77215ef-e683-4030-9b93-5f30814f1158" (UID: "a77215ef-e683-4030-9b93-5f30814f1158"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.628186 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a77215ef-e683-4030-9b93-5f30814f1158-logs\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.632345 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a77215ef-e683-4030-9b93-5f30814f1158-kube-api-access-knkwd" (OuterVolumeSpecName: "kube-api-access-knkwd") pod "a77215ef-e683-4030-9b93-5f30814f1158" (UID: "a77215ef-e683-4030-9b93-5f30814f1158"). InnerVolumeSpecName "kube-api-access-knkwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.632513 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caad2a71-260f-41e5-99f0-532d73995f41-kube-api-access-jnrvh" (OuterVolumeSpecName: "kube-api-access-jnrvh") pod "caad2a71-260f-41e5-99f0-532d73995f41" (UID: "caad2a71-260f-41e5-99f0-532d73995f41"). InnerVolumeSpecName "kube-api-access-jnrvh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.655316 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caad2a71-260f-41e5-99f0-532d73995f41-config-data" (OuterVolumeSpecName: "config-data") pod "caad2a71-260f-41e5-99f0-532d73995f41" (UID: "caad2a71-260f-41e5-99f0-532d73995f41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.658452 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a77215ef-e683-4030-9b93-5f30814f1158" (UID: "a77215ef-e683-4030-9b93-5f30814f1158"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.665728 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caad2a71-260f-41e5-99f0-532d73995f41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "caad2a71-260f-41e5-99f0-532d73995f41" (UID: "caad2a71-260f-41e5-99f0-532d73995f41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.669578 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-config-data" (OuterVolumeSpecName: "config-data") pod "a77215ef-e683-4030-9b93-5f30814f1158" (UID: "a77215ef-e683-4030-9b93-5f30814f1158"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.698298 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a77215ef-e683-4030-9b93-5f30814f1158" (UID: "a77215ef-e683-4030-9b93-5f30814f1158"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.730391 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnrvh\" (UniqueName: \"kubernetes.io/projected/caad2a71-260f-41e5-99f0-532d73995f41-kube-api-access-jnrvh\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.730427 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caad2a71-260f-41e5-99f0-532d73995f41-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.730441 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.730454 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knkwd\" (UniqueName: \"kubernetes.io/projected/a77215ef-e683-4030-9b93-5f30814f1158-kube-api-access-knkwd\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.730466 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caad2a71-260f-41e5-99f0-532d73995f41-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.730477 4873 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.730489 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a77215ef-e683-4030-9b93-5f30814f1158-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.782166 4873 generic.go:334] "Generic (PLEG): container finished" podID="caad2a71-260f-41e5-99f0-532d73995f41" containerID="c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225" exitCode=0
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.782245 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"caad2a71-260f-41e5-99f0-532d73995f41","Type":"ContainerDied","Data":"c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225"}
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.782279 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"caad2a71-260f-41e5-99f0-532d73995f41","Type":"ContainerDied","Data":"6286e92557ca766dfcf83501ac2e45ef3faaab8e9e1c247dd572c05c5d0518c2"}
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.782300 4873 scope.go:117] "RemoveContainer" containerID="c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.782423 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.792970 4873 generic.go:334] "Generic (PLEG): container finished" podID="a77215ef-e683-4030-9b93-5f30814f1158" containerID="0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796" exitCode=0
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.793008 4873 generic.go:334] "Generic (PLEG): container finished" podID="a77215ef-e683-4030-9b93-5f30814f1158" containerID="db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3" exitCode=143
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.793035 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a77215ef-e683-4030-9b93-5f30814f1158","Type":"ContainerDied","Data":"0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796"}
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.793066 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a77215ef-e683-4030-9b93-5f30814f1158","Type":"ContainerDied","Data":"db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3"}
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.793080 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a77215ef-e683-4030-9b93-5f30814f1158","Type":"ContainerDied","Data":"ecd5ec7a0c35523bee927dcc16abb03decd8d6779d5d402a7592d31b466ab8a8"}
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.793193 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.820554 4873 scope.go:117] "RemoveContainer" containerID="c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225"
Feb 19 10:04:51 crc kubenswrapper[4873]: E0219 10:04:51.821342 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225\": container with ID starting with c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225 not found: ID does not exist" containerID="c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.821379 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225"} err="failed to get container status \"c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225\": rpc error: code = NotFound desc = could not find container \"c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225\": container with ID starting with c398c7bd4d96ee9c9aa015eb71d58185943f0386fe2115d9a2a7fe20b51f8225 not found: ID does not exist"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.821408 4873 scope.go:117] "RemoveContainer" containerID="0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.845459 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.861838 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.862443 4873 scope.go:117] "RemoveContainer" containerID="db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.899267 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.911049 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.918069 4873 scope.go:117] "RemoveContainer" containerID="0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796"
Feb 19 10:04:51 crc kubenswrapper[4873]: E0219 10:04:51.918548 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796\": container with ID starting with 0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796 not found: ID does not exist" containerID="0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.918579 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796"} err="failed to get container status \"0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796\": rpc error: code = NotFound desc = could not find container \"0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796\": container with ID starting with 0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796 not found: ID does not exist"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.918601 4873 scope.go:117] "RemoveContainer" containerID="db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3"
Feb 19 10:04:51 crc kubenswrapper[4873]: E0219 10:04:51.918997 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3\": container with ID starting with db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3 not found: ID does not exist" containerID="db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.919020 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3"} err="failed to get container status \"db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3\": rpc error: code = NotFound desc = could not find container \"db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3\": container with ID starting with db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3 not found: ID does not exist"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.919032 4873 scope.go:117] "RemoveContainer" containerID="0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.919247 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796"} err="failed to get container status \"0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796\": rpc error: code = NotFound desc = could not find container \"0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796\": container with ID starting with 0116364fa1af8724e348bd59034de2a4b4e700ed9792921c6c61a583dae2d796 not found: ID does not exist"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.919260 4873 scope.go:117] "RemoveContainer" containerID="db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.919783 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3"} err="failed to get container status \"db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3\": rpc error: code = NotFound desc = could not find container \"db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3\": container with ID starting with db7748b8444da78c681046bde136f47b14e67f4541a54edd5e6cba0baa6114e3 not found: ID does not exist"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.924319 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 10:04:51 crc kubenswrapper[4873]: E0219 10:04:51.924750 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b81c17-9130-4def-8021-e73168601bf6" containerName="nova-manage"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.924768 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b81c17-9130-4def-8021-e73168601bf6" containerName="nova-manage"
Feb 19 10:04:51 crc kubenswrapper[4873]: E0219 10:04:51.924785 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78542dc-01da-47dc-aec5-a380b7484425" containerName="dnsmasq-dns"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.924792 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78542dc-01da-47dc-aec5-a380b7484425" containerName="dnsmasq-dns"
Feb 19 10:04:51 crc kubenswrapper[4873]: E0219 10:04:51.924805 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78542dc-01da-47dc-aec5-a380b7484425" containerName="init"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.924812 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78542dc-01da-47dc-aec5-a380b7484425" containerName="init"
Feb 19 10:04:51 crc kubenswrapper[4873]: E0219 10:04:51.924838 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caad2a71-260f-41e5-99f0-532d73995f41" containerName="nova-scheduler-scheduler"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.924845 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="caad2a71-260f-41e5-99f0-532d73995f41" containerName="nova-scheduler-scheduler"
Feb 19 10:04:51 crc kubenswrapper[4873]: E0219 10:04:51.924882 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a77215ef-e683-4030-9b93-5f30814f1158" containerName="nova-metadata-metadata"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.924887 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77215ef-e683-4030-9b93-5f30814f1158" containerName="nova-metadata-metadata"
Feb 19 10:04:51 crc kubenswrapper[4873]: E0219 10:04:51.924900 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a77215ef-e683-4030-9b93-5f30814f1158" containerName="nova-metadata-log"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.924906 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77215ef-e683-4030-9b93-5f30814f1158" containerName="nova-metadata-log"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.925081 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78542dc-01da-47dc-aec5-a380b7484425" containerName="dnsmasq-dns"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.925095 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="caad2a71-260f-41e5-99f0-532d73995f41" containerName="nova-scheduler-scheduler"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.925127 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a77215ef-e683-4030-9b93-5f30814f1158" containerName="nova-metadata-log"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.925137 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b81c17-9130-4def-8021-e73168601bf6" containerName="nova-manage"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.925185 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a77215ef-e683-4030-9b93-5f30814f1158" containerName="nova-metadata-metadata"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.932589 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.937855 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.977175 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.979394 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.981634 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.983143 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.985436 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 10:04:51 crc kubenswrapper[4873]: I0219 10:04:51.998410 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.054763 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4nvb\" (UniqueName: \"kubernetes.io/projected/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-kube-api-access-m4nvb\") pod \"nova-scheduler-0\" (UID: \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\") " pod="openstack/nova-scheduler-0"
Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.057082 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\") " pod="openstack/nova-scheduler-0"
Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.057163 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-config-data\") pod \"nova-scheduler-0\" (UID: \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\") " pod="openstack/nova-scheduler-0"
Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.158884 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-config-data\") pod \"nova-scheduler-0\" (UID: \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\") " pod="openstack/nova-scheduler-0"
Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.158963 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4nvb\" (UniqueName: \"kubernetes.io/projected/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-kube-api-access-m4nvb\") pod \"nova-scheduler-0\" (UID: \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\") " pod="openstack/nova-scheduler-0"
Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.159008 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c5b09-1134-4319-890d-8d42e916fc4c-logs\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0"
Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.159053 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0"
Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.159159 4873 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.159189 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttk6m\" (UniqueName: \"kubernetes.io/projected/ab0c5b09-1134-4319-890d-8d42e916fc4c-kube-api-access-ttk6m\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.159212 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-config-data\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.159371 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\") " pod="openstack/nova-scheduler-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.168480 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\") " pod="openstack/nova-scheduler-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.181293 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-config-data\") pod \"nova-scheduler-0\" (UID: \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\") " pod="openstack/nova-scheduler-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.181626 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4nvb\" (UniqueName: \"kubernetes.io/projected/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-kube-api-access-m4nvb\") pod \"nova-scheduler-0\" (UID: \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\") " pod="openstack/nova-scheduler-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.261398 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c5b09-1134-4319-890d-8d42e916fc4c-logs\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.262024 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c5b09-1134-4319-890d-8d42e916fc4c-logs\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.262082 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.262307 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0" Feb 19 10:04:52 crc 
kubenswrapper[4873]: I0219 10:04:52.262377 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttk6m\" (UniqueName: \"kubernetes.io/projected/ab0c5b09-1134-4319-890d-8d42e916fc4c-kube-api-access-ttk6m\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.262405 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-config-data\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.266667 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.266702 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-config-data\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.267043 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.290610 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttk6m\" (UniqueName: 
\"kubernetes.io/projected/ab0c5b09-1134-4319-890d-8d42e916fc4c-kube-api-access-ttk6m\") pod \"nova-metadata-0\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " pod="openstack/nova-metadata-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.329922 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.416028 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.426942 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.466208 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a513081d-764b-47d8-85d8-7019d7ea92ca-config-data\") pod \"a513081d-764b-47d8-85d8-7019d7ea92ca\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.466300 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a513081d-764b-47d8-85d8-7019d7ea92ca-logs\") pod \"a513081d-764b-47d8-85d8-7019d7ea92ca\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.466339 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnrln\" (UniqueName: \"kubernetes.io/projected/a513081d-764b-47d8-85d8-7019d7ea92ca-kube-api-access-gnrln\") pod \"a513081d-764b-47d8-85d8-7019d7ea92ca\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.466605 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a513081d-764b-47d8-85d8-7019d7ea92ca-combined-ca-bundle\") pod \"a513081d-764b-47d8-85d8-7019d7ea92ca\" (UID: \"a513081d-764b-47d8-85d8-7019d7ea92ca\") " Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.467297 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a513081d-764b-47d8-85d8-7019d7ea92ca-logs" (OuterVolumeSpecName: "logs") pod "a513081d-764b-47d8-85d8-7019d7ea92ca" (UID: "a513081d-764b-47d8-85d8-7019d7ea92ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.470867 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a513081d-764b-47d8-85d8-7019d7ea92ca-kube-api-access-gnrln" (OuterVolumeSpecName: "kube-api-access-gnrln") pod "a513081d-764b-47d8-85d8-7019d7ea92ca" (UID: "a513081d-764b-47d8-85d8-7019d7ea92ca"). InnerVolumeSpecName "kube-api-access-gnrln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.504896 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a513081d-764b-47d8-85d8-7019d7ea92ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a513081d-764b-47d8-85d8-7019d7ea92ca" (UID: "a513081d-764b-47d8-85d8-7019d7ea92ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.530338 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a513081d-764b-47d8-85d8-7019d7ea92ca-config-data" (OuterVolumeSpecName: "config-data") pod "a513081d-764b-47d8-85d8-7019d7ea92ca" (UID: "a513081d-764b-47d8-85d8-7019d7ea92ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.569486 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a513081d-764b-47d8-85d8-7019d7ea92ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.569519 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a513081d-764b-47d8-85d8-7019d7ea92ca-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.569532 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a513081d-764b-47d8-85d8-7019d7ea92ca-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.569543 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnrln\" (UniqueName: \"kubernetes.io/projected/a513081d-764b-47d8-85d8-7019d7ea92ca-kube-api-access-gnrln\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.812271 4873 generic.go:334] "Generic (PLEG): container finished" podID="a513081d-764b-47d8-85d8-7019d7ea92ca" containerID="b1b21af5cb273a16647acb4511e7255326ce000c69525f549d69868584813f49" exitCode=0 Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.812556 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a513081d-764b-47d8-85d8-7019d7ea92ca","Type":"ContainerDied","Data":"b1b21af5cb273a16647acb4511e7255326ce000c69525f549d69868584813f49"} Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.812586 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a513081d-764b-47d8-85d8-7019d7ea92ca","Type":"ContainerDied","Data":"d76a4cd46cc684d689557d9372222e68498dc6bce68b7bb1dbfe2b38a2ff9d6a"} Feb 19 10:04:52 crc kubenswrapper[4873]: 
I0219 10:04:52.812607 4873 scope.go:117] "RemoveContainer" containerID="b1b21af5cb273a16647acb4511e7255326ce000c69525f549d69868584813f49" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.812736 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.845359 4873 scope.go:117] "RemoveContainer" containerID="bed6311acd16b8356a4487f32ee541e984f5932f9c965fe2c30e30a6b4581fde" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.856294 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.867573 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.877674 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 10:04:52 crc kubenswrapper[4873]: E0219 10:04:52.878225 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a513081d-764b-47d8-85d8-7019d7ea92ca" containerName="nova-api-api" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.878253 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a513081d-764b-47d8-85d8-7019d7ea92ca" containerName="nova-api-api" Feb 19 10:04:52 crc kubenswrapper[4873]: E0219 10:04:52.878280 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a513081d-764b-47d8-85d8-7019d7ea92ca" containerName="nova-api-log" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.878289 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a513081d-764b-47d8-85d8-7019d7ea92ca" containerName="nova-api-log" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.878560 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a513081d-764b-47d8-85d8-7019d7ea92ca" containerName="nova-api-log" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.878592 4873 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a513081d-764b-47d8-85d8-7019d7ea92ca" containerName="nova-api-api" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.879885 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.888170 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.893382 4873 scope.go:117] "RemoveContainer" containerID="b1b21af5cb273a16647acb4511e7255326ce000c69525f549d69868584813f49" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.894031 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 10:04:52 crc kubenswrapper[4873]: E0219 10:04:52.896214 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b21af5cb273a16647acb4511e7255326ce000c69525f549d69868584813f49\": container with ID starting with b1b21af5cb273a16647acb4511e7255326ce000c69525f549d69868584813f49 not found: ID does not exist" containerID="b1b21af5cb273a16647acb4511e7255326ce000c69525f549d69868584813f49" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.896264 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b21af5cb273a16647acb4511e7255326ce000c69525f549d69868584813f49"} err="failed to get container status \"b1b21af5cb273a16647acb4511e7255326ce000c69525f549d69868584813f49\": rpc error: code = NotFound desc = could not find container \"b1b21af5cb273a16647acb4511e7255326ce000c69525f549d69868584813f49\": container with ID starting with b1b21af5cb273a16647acb4511e7255326ce000c69525f549d69868584813f49 not found: ID does not exist" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.896305 4873 scope.go:117] "RemoveContainer" 
containerID="bed6311acd16b8356a4487f32ee541e984f5932f9c965fe2c30e30a6b4581fde" Feb 19 10:04:52 crc kubenswrapper[4873]: E0219 10:04:52.898339 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed6311acd16b8356a4487f32ee541e984f5932f9c965fe2c30e30a6b4581fde\": container with ID starting with bed6311acd16b8356a4487f32ee541e984f5932f9c965fe2c30e30a6b4581fde not found: ID does not exist" containerID="bed6311acd16b8356a4487f32ee541e984f5932f9c965fe2c30e30a6b4581fde" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.898380 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed6311acd16b8356a4487f32ee541e984f5932f9c965fe2c30e30a6b4581fde"} err="failed to get container status \"bed6311acd16b8356a4487f32ee541e984f5932f9c965fe2c30e30a6b4581fde\": rpc error: code = NotFound desc = could not find container \"bed6311acd16b8356a4487f32ee541e984f5932f9c965fe2c30e30a6b4581fde\": container with ID starting with bed6311acd16b8356a4487f32ee541e984f5932f9c965fe2c30e30a6b4581fde not found: ID does not exist" Feb 19 10:04:52 crc kubenswrapper[4873]: W0219 10:04:52.911462 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8db35141_6a4c_41cb_8a70_c68ab32fb2fe.slice/crio-f055548cf56af3386b5caa98b53e876cf48bc03e71f26f63cd2f36d9a6f05688 WatchSource:0}: Error finding container f055548cf56af3386b5caa98b53e876cf48bc03e71f26f63cd2f36d9a6f05688: Status 404 returned error can't find the container with id f055548cf56af3386b5caa98b53e876cf48bc03e71f26f63cd2f36d9a6f05688 Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.925272 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.976983 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2e87055-b0d9-4e47-9e2d-db14987e29c1-logs\") pod \"nova-api-0\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " pod="openstack/nova-api-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.977051 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2e87055-b0d9-4e47-9e2d-db14987e29c1-config-data\") pod \"nova-api-0\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " pod="openstack/nova-api-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.977301 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbwjp\" (UniqueName: \"kubernetes.io/projected/d2e87055-b0d9-4e47-9e2d-db14987e29c1-kube-api-access-jbwjp\") pod \"nova-api-0\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " pod="openstack/nova-api-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.977522 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e87055-b0d9-4e47-9e2d-db14987e29c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " pod="openstack/nova-api-0" Feb 19 10:04:52 crc kubenswrapper[4873]: I0219 10:04:52.987717 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 10:04:52 crc kubenswrapper[4873]: W0219 10:04:52.992309 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab0c5b09_1134_4319_890d_8d42e916fc4c.slice/crio-bccf19aed991c39d7abd6ccba3455f58690ec6cd1d9a81513e1e6040784b81a5 WatchSource:0}: Error finding container bccf19aed991c39d7abd6ccba3455f58690ec6cd1d9a81513e1e6040784b81a5: Status 404 returned error can't find the container with id 
bccf19aed991c39d7abd6ccba3455f58690ec6cd1d9a81513e1e6040784b81a5 Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.080134 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2e87055-b0d9-4e47-9e2d-db14987e29c1-logs\") pod \"nova-api-0\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " pod="openstack/nova-api-0" Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.080187 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2e87055-b0d9-4e47-9e2d-db14987e29c1-config-data\") pod \"nova-api-0\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " pod="openstack/nova-api-0" Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.080302 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbwjp\" (UniqueName: \"kubernetes.io/projected/d2e87055-b0d9-4e47-9e2d-db14987e29c1-kube-api-access-jbwjp\") pod \"nova-api-0\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " pod="openstack/nova-api-0" Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.080606 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e87055-b0d9-4e47-9e2d-db14987e29c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " pod="openstack/nova-api-0" Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.081342 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2e87055-b0d9-4e47-9e2d-db14987e29c1-logs\") pod \"nova-api-0\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " pod="openstack/nova-api-0" Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.085554 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d2e87055-b0d9-4e47-9e2d-db14987e29c1-config-data\") pod \"nova-api-0\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " pod="openstack/nova-api-0" Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.086049 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e87055-b0d9-4e47-9e2d-db14987e29c1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " pod="openstack/nova-api-0" Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.102233 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbwjp\" (UniqueName: \"kubernetes.io/projected/d2e87055-b0d9-4e47-9e2d-db14987e29c1-kube-api-access-jbwjp\") pod \"nova-api-0\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " pod="openstack/nova-api-0" Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.266144 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.497861 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a513081d-764b-47d8-85d8-7019d7ea92ca" path="/var/lib/kubelet/pods/a513081d-764b-47d8-85d8-7019d7ea92ca/volumes" Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.498917 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a77215ef-e683-4030-9b93-5f30814f1158" path="/var/lib/kubelet/pods/a77215ef-e683-4030-9b93-5f30814f1158/volumes" Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.499562 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caad2a71-260f-41e5-99f0-532d73995f41" path="/var/lib/kubelet/pods/caad2a71-260f-41e5-99f0-532d73995f41/volumes" Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.729375 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:04:53 crc kubenswrapper[4873]: W0219 
10:04:53.739989 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2e87055_b0d9_4e47_9e2d_db14987e29c1.slice/crio-d550b23e94f3932a98e323b206b2513b13546a69a9e2a91f44f15350f1bda5ab WatchSource:0}: Error finding container d550b23e94f3932a98e323b206b2513b13546a69a9e2a91f44f15350f1bda5ab: Status 404 returned error can't find the container with id d550b23e94f3932a98e323b206b2513b13546a69a9e2a91f44f15350f1bda5ab Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.836165 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8db35141-6a4c-41cb-8a70-c68ab32fb2fe","Type":"ContainerStarted","Data":"0076bd1c46cbf74257579e26ec45274c90ac1a1b4889da41cc03fa8662a3134c"} Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.836208 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8db35141-6a4c-41cb-8a70-c68ab32fb2fe","Type":"ContainerStarted","Data":"f055548cf56af3386b5caa98b53e876cf48bc03e71f26f63cd2f36d9a6f05688"} Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.839058 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c5b09-1134-4319-890d-8d42e916fc4c","Type":"ContainerStarted","Data":"772f80f78694557364cb6aacede90edcdbc43950df0974bec13a72c5fca43e1c"} Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.839116 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c5b09-1134-4319-890d-8d42e916fc4c","Type":"ContainerStarted","Data":"109894b19d76dd1c9ccf380213b2386f244585a42e16fabe157c7ca98cdcf609"} Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.839126 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"ab0c5b09-1134-4319-890d-8d42e916fc4c","Type":"ContainerStarted","Data":"bccf19aed991c39d7abd6ccba3455f58690ec6cd1d9a81513e1e6040784b81a5"} Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.841726 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2e87055-b0d9-4e47-9e2d-db14987e29c1","Type":"ContainerStarted","Data":"d550b23e94f3932a98e323b206b2513b13546a69a9e2a91f44f15350f1bda5ab"} Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.861894 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.861871464 podStartE2EDuration="2.861871464s" podCreationTimestamp="2026-02-19 10:04:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:04:53.852321936 +0000 UTC m=+1203.141753574" watchObservedRunningTime="2026-02-19 10:04:53.861871464 +0000 UTC m=+1203.151303102" Feb 19 10:04:53 crc kubenswrapper[4873]: I0219 10:04:53.874921 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.87490061 podStartE2EDuration="2.87490061s" podCreationTimestamp="2026-02-19 10:04:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:04:53.871198478 +0000 UTC m=+1203.160630116" watchObservedRunningTime="2026-02-19 10:04:53.87490061 +0000 UTC m=+1203.164332248" Feb 19 10:04:54 crc kubenswrapper[4873]: I0219 10:04:54.857322 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2e87055-b0d9-4e47-9e2d-db14987e29c1","Type":"ContainerStarted","Data":"2ca3a3ad7f9ca37295ebb7a316ce7b89e3162d36b731b68e154c3c09cba2fb3e"} Feb 19 10:04:54 crc kubenswrapper[4873]: I0219 10:04:54.858454 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"d2e87055-b0d9-4e47-9e2d-db14987e29c1","Type":"ContainerStarted","Data":"f6c6a093d08b34db5340df77d6fafcc6cae16deeb5b24238c9aa4d6a0671d3c2"} Feb 19 10:04:54 crc kubenswrapper[4873]: I0219 10:04:54.860524 4873 generic.go:334] "Generic (PLEG): container finished" podID="96fca831-509a-4abd-bb7e-2c0f4704368b" containerID="343f5f5d97db66db1963a29b53ef93078842c1069756343de5aa869eb8885cd9" exitCode=0 Feb 19 10:04:54 crc kubenswrapper[4873]: I0219 10:04:54.860638 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9z5nq" event={"ID":"96fca831-509a-4abd-bb7e-2c0f4704368b","Type":"ContainerDied","Data":"343f5f5d97db66db1963a29b53ef93078842c1069756343de5aa869eb8885cd9"} Feb 19 10:04:54 crc kubenswrapper[4873]: I0219 10:04:54.908539 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.908511678 podStartE2EDuration="2.908511678s" podCreationTimestamp="2026-02-19 10:04:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:04:54.888168199 +0000 UTC m=+1204.177599857" watchObservedRunningTime="2026-02-19 10:04:54.908511678 +0000 UTC m=+1204.197943356" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.267047 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.271903 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-config-data\") pod \"96fca831-509a-4abd-bb7e-2c0f4704368b\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.271997 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d48kf\" (UniqueName: \"kubernetes.io/projected/96fca831-509a-4abd-bb7e-2c0f4704368b-kube-api-access-d48kf\") pod \"96fca831-509a-4abd-bb7e-2c0f4704368b\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.272113 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-scripts\") pod \"96fca831-509a-4abd-bb7e-2c0f4704368b\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.272203 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-combined-ca-bundle\") pod \"96fca831-509a-4abd-bb7e-2c0f4704368b\" (UID: \"96fca831-509a-4abd-bb7e-2c0f4704368b\") " Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.277934 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96fca831-509a-4abd-bb7e-2c0f4704368b-kube-api-access-d48kf" (OuterVolumeSpecName: "kube-api-access-d48kf") pod "96fca831-509a-4abd-bb7e-2c0f4704368b" (UID: "96fca831-509a-4abd-bb7e-2c0f4704368b"). InnerVolumeSpecName "kube-api-access-d48kf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.280844 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-scripts" (OuterVolumeSpecName: "scripts") pod "96fca831-509a-4abd-bb7e-2c0f4704368b" (UID: "96fca831-509a-4abd-bb7e-2c0f4704368b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.310257 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-config-data" (OuterVolumeSpecName: "config-data") pod "96fca831-509a-4abd-bb7e-2c0f4704368b" (UID: "96fca831-509a-4abd-bb7e-2c0f4704368b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.312451 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96fca831-509a-4abd-bb7e-2c0f4704368b" (UID: "96fca831-509a-4abd-bb7e-2c0f4704368b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.373850 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.373893 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d48kf\" (UniqueName: \"kubernetes.io/projected/96fca831-509a-4abd-bb7e-2c0f4704368b-kube-api-access-d48kf\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.373907 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.373918 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fca831-509a-4abd-bb7e-2c0f4704368b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.884690 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9z5nq" event={"ID":"96fca831-509a-4abd-bb7e-2c0f4704368b","Type":"ContainerDied","Data":"fa528c871922eeb11614e9dc0af7c2459d78bffa25b2a2a86e1fa5e00eb6941c"} Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.884733 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa528c871922eeb11614e9dc0af7c2459d78bffa25b2a2a86e1fa5e00eb6941c" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.884781 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9z5nq" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.977467 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 10:04:56 crc kubenswrapper[4873]: E0219 10:04:56.977859 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96fca831-509a-4abd-bb7e-2c0f4704368b" containerName="nova-cell1-conductor-db-sync" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.977877 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fca831-509a-4abd-bb7e-2c0f4704368b" containerName="nova-cell1-conductor-db-sync" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.978058 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="96fca831-509a-4abd-bb7e-2c0f4704368b" containerName="nova-cell1-conductor-db-sync" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.978688 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.982262 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.985993 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0688136a-f0b5-4a2a-8f08-9c99d9c3644c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0688136a-f0b5-4a2a-8f08-9c99d9c3644c\") " pod="openstack/nova-cell1-conductor-0" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.986125 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0688136a-f0b5-4a2a-8f08-9c99d9c3644c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0688136a-f0b5-4a2a-8f08-9c99d9c3644c\") " 
pod="openstack/nova-cell1-conductor-0" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.986184 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgf4k\" (UniqueName: \"kubernetes.io/projected/0688136a-f0b5-4a2a-8f08-9c99d9c3644c-kube-api-access-fgf4k\") pod \"nova-cell1-conductor-0\" (UID: \"0688136a-f0b5-4a2a-8f08-9c99d9c3644c\") " pod="openstack/nova-cell1-conductor-0" Feb 19 10:04:56 crc kubenswrapper[4873]: I0219 10:04:56.993047 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 10:04:57 crc kubenswrapper[4873]: I0219 10:04:57.088316 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgf4k\" (UniqueName: \"kubernetes.io/projected/0688136a-f0b5-4a2a-8f08-9c99d9c3644c-kube-api-access-fgf4k\") pod \"nova-cell1-conductor-0\" (UID: \"0688136a-f0b5-4a2a-8f08-9c99d9c3644c\") " pod="openstack/nova-cell1-conductor-0" Feb 19 10:04:57 crc kubenswrapper[4873]: I0219 10:04:57.088758 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0688136a-f0b5-4a2a-8f08-9c99d9c3644c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0688136a-f0b5-4a2a-8f08-9c99d9c3644c\") " pod="openstack/nova-cell1-conductor-0" Feb 19 10:04:57 crc kubenswrapper[4873]: I0219 10:04:57.088943 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0688136a-f0b5-4a2a-8f08-9c99d9c3644c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0688136a-f0b5-4a2a-8f08-9c99d9c3644c\") " pod="openstack/nova-cell1-conductor-0" Feb 19 10:04:57 crc kubenswrapper[4873]: I0219 10:04:57.096156 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0688136a-f0b5-4a2a-8f08-9c99d9c3644c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0688136a-f0b5-4a2a-8f08-9c99d9c3644c\") " pod="openstack/nova-cell1-conductor-0" Feb 19 10:04:57 crc kubenswrapper[4873]: I0219 10:04:57.096735 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0688136a-f0b5-4a2a-8f08-9c99d9c3644c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0688136a-f0b5-4a2a-8f08-9c99d9c3644c\") " pod="openstack/nova-cell1-conductor-0" Feb 19 10:04:57 crc kubenswrapper[4873]: I0219 10:04:57.105376 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgf4k\" (UniqueName: \"kubernetes.io/projected/0688136a-f0b5-4a2a-8f08-9c99d9c3644c-kube-api-access-fgf4k\") pod \"nova-cell1-conductor-0\" (UID: \"0688136a-f0b5-4a2a-8f08-9c99d9c3644c\") " pod="openstack/nova-cell1-conductor-0" Feb 19 10:04:57 crc kubenswrapper[4873]: I0219 10:04:57.300878 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 10:04:57 crc kubenswrapper[4873]: I0219 10:04:57.420487 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 10:04:57 crc kubenswrapper[4873]: I0219 10:04:57.427273 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 10:04:57 crc kubenswrapper[4873]: I0219 10:04:57.427872 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 10:04:57 crc kubenswrapper[4873]: I0219 10:04:57.812667 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 10:04:57 crc kubenswrapper[4873]: W0219 10:04:57.813222 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0688136a_f0b5_4a2a_8f08_9c99d9c3644c.slice/crio-5c47344e502844640fdf428f96539358855e0d910280920c96a09e38b74656ca WatchSource:0}: Error finding container 5c47344e502844640fdf428f96539358855e0d910280920c96a09e38b74656ca: Status 404 returned error can't find the container with id 5c47344e502844640fdf428f96539358855e0d910280920c96a09e38b74656ca Feb 19 10:04:57 crc kubenswrapper[4873]: I0219 10:04:57.897025 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0688136a-f0b5-4a2a-8f08-9c99d9c3644c","Type":"ContainerStarted","Data":"5c47344e502844640fdf428f96539358855e0d910280920c96a09e38b74656ca"} Feb 19 10:04:58 crc kubenswrapper[4873]: I0219 10:04:58.910047 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0688136a-f0b5-4a2a-8f08-9c99d9c3644c","Type":"ContainerStarted","Data":"32a28b03f93d802720d86299515a8bd1793dbc1a6df341c0590c74dffa8f5aab"} Feb 19 10:04:58 crc kubenswrapper[4873]: I0219 10:04:58.910406 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-cell1-conductor-0" Feb 19 10:04:58 crc kubenswrapper[4873]: I0219 10:04:58.941156 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.941136976 podStartE2EDuration="2.941136976s" podCreationTimestamp="2026-02-19 10:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:04:58.92452411 +0000 UTC m=+1208.213955768" watchObservedRunningTime="2026-02-19 10:04:58.941136976 +0000 UTC m=+1208.230568614" Feb 19 10:04:59 crc kubenswrapper[4873]: I0219 10:04:59.631546 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 10:05:02 crc kubenswrapper[4873]: I0219 10:05:02.416857 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 10:05:02 crc kubenswrapper[4873]: I0219 10:05:02.428483 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 10:05:02 crc kubenswrapper[4873]: I0219 10:05:02.428541 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 10:05:02 crc kubenswrapper[4873]: I0219 10:05:02.443848 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 10:05:02 crc kubenswrapper[4873]: I0219 10:05:02.974433 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 10:05:03 crc kubenswrapper[4873]: I0219 10:05:03.266737 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 10:05:03 crc kubenswrapper[4873]: I0219 10:05:03.266788 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 10:05:03 crc 
kubenswrapper[4873]: I0219 10:05:03.289162 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 10:05:03 crc kubenswrapper[4873]: I0219 10:05:03.289440 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5224ec80-b354-467f-b660-2d22b9725be0" containerName="kube-state-metrics" containerID="cri-o://fff297dee24ce88f13715d9cd5435080bd7f84d8bee1670209f4a402d95507f1" gracePeriod=30 Feb 19 10:05:03 crc kubenswrapper[4873]: I0219 10:05:03.442274 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ab0c5b09-1134-4319-890d-8d42e916fc4c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 10:05:03 crc kubenswrapper[4873]: I0219 10:05:03.442470 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ab0c5b09-1134-4319-890d-8d42e916fc4c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 10:05:03 crc kubenswrapper[4873]: I0219 10:05:03.866267 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 10:05:03 crc kubenswrapper[4873]: I0219 10:05:03.959716 4873 generic.go:334] "Generic (PLEG): container finished" podID="5224ec80-b354-467f-b660-2d22b9725be0" containerID="fff297dee24ce88f13715d9cd5435080bd7f84d8bee1670209f4a402d95507f1" exitCode=2 Feb 19 10:05:03 crc kubenswrapper[4873]: I0219 10:05:03.959793 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 10:05:03 crc kubenswrapper[4873]: I0219 10:05:03.959833 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5224ec80-b354-467f-b660-2d22b9725be0","Type":"ContainerDied","Data":"fff297dee24ce88f13715d9cd5435080bd7f84d8bee1670209f4a402d95507f1"} Feb 19 10:05:03 crc kubenswrapper[4873]: I0219 10:05:03.959861 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5224ec80-b354-467f-b660-2d22b9725be0","Type":"ContainerDied","Data":"e0fab87f6d902a58d41b4b35cef6645c9197dee8f59fc04defe1aac4065e472b"} Feb 19 10:05:03 crc kubenswrapper[4873]: I0219 10:05:03.959877 4873 scope.go:117] "RemoveContainer" containerID="fff297dee24ce88f13715d9cd5435080bd7f84d8bee1670209f4a402d95507f1" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.000732 4873 scope.go:117] "RemoveContainer" containerID="fff297dee24ce88f13715d9cd5435080bd7f84d8bee1670209f4a402d95507f1" Feb 19 10:05:04 crc kubenswrapper[4873]: E0219 10:05:04.002395 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fff297dee24ce88f13715d9cd5435080bd7f84d8bee1670209f4a402d95507f1\": container with ID starting with fff297dee24ce88f13715d9cd5435080bd7f84d8bee1670209f4a402d95507f1 not found: ID does not exist" containerID="fff297dee24ce88f13715d9cd5435080bd7f84d8bee1670209f4a402d95507f1" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.002450 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fff297dee24ce88f13715d9cd5435080bd7f84d8bee1670209f4a402d95507f1"} err="failed to get container status \"fff297dee24ce88f13715d9cd5435080bd7f84d8bee1670209f4a402d95507f1\": rpc error: code = NotFound desc = could not find container \"fff297dee24ce88f13715d9cd5435080bd7f84d8bee1670209f4a402d95507f1\": container with ID starting with 
fff297dee24ce88f13715d9cd5435080bd7f84d8bee1670209f4a402d95507f1 not found: ID does not exist" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.022162 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fb6h\" (UniqueName: \"kubernetes.io/projected/5224ec80-b354-467f-b660-2d22b9725be0-kube-api-access-7fb6h\") pod \"5224ec80-b354-467f-b660-2d22b9725be0\" (UID: \"5224ec80-b354-467f-b660-2d22b9725be0\") " Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.031472 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5224ec80-b354-467f-b660-2d22b9725be0-kube-api-access-7fb6h" (OuterVolumeSpecName: "kube-api-access-7fb6h") pod "5224ec80-b354-467f-b660-2d22b9725be0" (UID: "5224ec80-b354-467f-b660-2d22b9725be0"). InnerVolumeSpecName "kube-api-access-7fb6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.125220 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fb6h\" (UniqueName: \"kubernetes.io/projected/5224ec80-b354-467f-b660-2d22b9725be0-kube-api-access-7fb6h\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.298131 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.310639 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.326187 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 10:05:04 crc kubenswrapper[4873]: E0219 10:05:04.327719 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5224ec80-b354-467f-b660-2d22b9725be0" containerName="kube-state-metrics" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.327849 4873 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5224ec80-b354-467f-b660-2d22b9725be0" containerName="kube-state-metrics" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.328799 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="5224ec80-b354-467f-b660-2d22b9725be0" containerName="kube-state-metrics" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.330077 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.343000 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.358052 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.361497 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d2e87055-b0d9-4e47-9e2d-db14987e29c1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.362069 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d2e87055-b0d9-4e47-9e2d-db14987e29c1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.378825 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.432316 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqjf8\" (UniqueName: \"kubernetes.io/projected/84c63c73-45f3-4d27-a3a3-cbfecd9e1810-kube-api-access-lqjf8\") pod 
\"kube-state-metrics-0\" (UID: \"84c63c73-45f3-4d27-a3a3-cbfecd9e1810\") " pod="openstack/kube-state-metrics-0" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.432367 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c63c73-45f3-4d27-a3a3-cbfecd9e1810-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"84c63c73-45f3-4d27-a3a3-cbfecd9e1810\") " pod="openstack/kube-state-metrics-0" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.432422 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/84c63c73-45f3-4d27-a3a3-cbfecd9e1810-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"84c63c73-45f3-4d27-a3a3-cbfecd9e1810\") " pod="openstack/kube-state-metrics-0" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.432462 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/84c63c73-45f3-4d27-a3a3-cbfecd9e1810-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"84c63c73-45f3-4d27-a3a3-cbfecd9e1810\") " pod="openstack/kube-state-metrics-0" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.534461 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqjf8\" (UniqueName: \"kubernetes.io/projected/84c63c73-45f3-4d27-a3a3-cbfecd9e1810-kube-api-access-lqjf8\") pod \"kube-state-metrics-0\" (UID: \"84c63c73-45f3-4d27-a3a3-cbfecd9e1810\") " pod="openstack/kube-state-metrics-0" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.534514 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c63c73-45f3-4d27-a3a3-cbfecd9e1810-combined-ca-bundle\") pod 
\"kube-state-metrics-0\" (UID: \"84c63c73-45f3-4d27-a3a3-cbfecd9e1810\") " pod="openstack/kube-state-metrics-0" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.534589 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/84c63c73-45f3-4d27-a3a3-cbfecd9e1810-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"84c63c73-45f3-4d27-a3a3-cbfecd9e1810\") " pod="openstack/kube-state-metrics-0" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.534641 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/84c63c73-45f3-4d27-a3a3-cbfecd9e1810-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"84c63c73-45f3-4d27-a3a3-cbfecd9e1810\") " pod="openstack/kube-state-metrics-0" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.539774 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/84c63c73-45f3-4d27-a3a3-cbfecd9e1810-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"84c63c73-45f3-4d27-a3a3-cbfecd9e1810\") " pod="openstack/kube-state-metrics-0" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.550165 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c63c73-45f3-4d27-a3a3-cbfecd9e1810-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"84c63c73-45f3-4d27-a3a3-cbfecd9e1810\") " pod="openstack/kube-state-metrics-0" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.552073 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/84c63c73-45f3-4d27-a3a3-cbfecd9e1810-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: 
\"84c63c73-45f3-4d27-a3a3-cbfecd9e1810\") " pod="openstack/kube-state-metrics-0" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.553329 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqjf8\" (UniqueName: \"kubernetes.io/projected/84c63c73-45f3-4d27-a3a3-cbfecd9e1810-kube-api-access-lqjf8\") pod \"kube-state-metrics-0\" (UID: \"84c63c73-45f3-4d27-a3a3-cbfecd9e1810\") " pod="openstack/kube-state-metrics-0" Feb 19 10:05:04 crc kubenswrapper[4873]: I0219 10:05:04.659625 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 10:05:05 crc kubenswrapper[4873]: I0219 10:05:05.126836 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 10:05:05 crc kubenswrapper[4873]: I0219 10:05:05.521698 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5224ec80-b354-467f-b660-2d22b9725be0" path="/var/lib/kubelet/pods/5224ec80-b354-467f-b660-2d22b9725be0/volumes" Feb 19 10:05:05 crc kubenswrapper[4873]: I0219 10:05:05.850851 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:05:05 crc kubenswrapper[4873]: I0219 10:05:05.851241 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="sg-core" containerID="cri-o://090a2539fe4dfb43105b70b0c0088e33de18f21996d9aa508978b202715db0e2" gracePeriod=30 Feb 19 10:05:05 crc kubenswrapper[4873]: I0219 10:05:05.851266 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="ceilometer-notification-agent" containerID="cri-o://6a6cd3259862f8346cd815b704d4c845d978984e7e26d6d40916439da8e278e3" gracePeriod=30 Feb 19 10:05:05 crc kubenswrapper[4873]: I0219 10:05:05.851240 4873 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="proxy-httpd" containerID="cri-o://08fd273370ab3603598065600f3bad166543e52db0dec0c4a5d168de0ad4347a" gracePeriod=30 Feb 19 10:05:05 crc kubenswrapper[4873]: I0219 10:05:05.851456 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="ceilometer-central-agent" containerID="cri-o://927d24780f007579d39d1cb2db522f7319926e96deda19a1ce5ec381489e91e9" gracePeriod=30 Feb 19 10:05:05 crc kubenswrapper[4873]: I0219 10:05:05.980533 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"84c63c73-45f3-4d27-a3a3-cbfecd9e1810","Type":"ContainerStarted","Data":"8a16ce75f567cbe15430ec901a55481ae1f2669fb715f58f63a09854ac4b36a8"} Feb 19 10:05:06 crc kubenswrapper[4873]: I0219 10:05:06.993912 4873 generic.go:334] "Generic (PLEG): container finished" podID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerID="08fd273370ab3603598065600f3bad166543e52db0dec0c4a5d168de0ad4347a" exitCode=0 Feb 19 10:05:06 crc kubenswrapper[4873]: I0219 10:05:06.993953 4873 generic.go:334] "Generic (PLEG): container finished" podID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerID="090a2539fe4dfb43105b70b0c0088e33de18f21996d9aa508978b202715db0e2" exitCode=2 Feb 19 10:05:06 crc kubenswrapper[4873]: I0219 10:05:06.993963 4873 generic.go:334] "Generic (PLEG): container finished" podID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerID="927d24780f007579d39d1cb2db522f7319926e96deda19a1ce5ec381489e91e9" exitCode=0 Feb 19 10:05:06 crc kubenswrapper[4873]: I0219 10:05:06.993977 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaaed141-d989-4b6f-ad3b-aefe0952c823","Type":"ContainerDied","Data":"08fd273370ab3603598065600f3bad166543e52db0dec0c4a5d168de0ad4347a"} Feb 19 10:05:06 crc 
kubenswrapper[4873]: I0219 10:05:06.994027 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaaed141-d989-4b6f-ad3b-aefe0952c823","Type":"ContainerDied","Data":"090a2539fe4dfb43105b70b0c0088e33de18f21996d9aa508978b202715db0e2"} Feb 19 10:05:06 crc kubenswrapper[4873]: I0219 10:05:06.994041 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaaed141-d989-4b6f-ad3b-aefe0952c823","Type":"ContainerDied","Data":"927d24780f007579d39d1cb2db522f7319926e96deda19a1ce5ec381489e91e9"} Feb 19 10:05:06 crc kubenswrapper[4873]: I0219 10:05:06.995664 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"84c63c73-45f3-4d27-a3a3-cbfecd9e1810","Type":"ContainerStarted","Data":"e9c6bfb09d4c220b8db81ffbf2c1166fe1e83c7922f9e171dae791088134bef5"} Feb 19 10:05:06 crc kubenswrapper[4873]: I0219 10:05:06.996030 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 10:05:07 crc kubenswrapper[4873]: I0219 10:05:07.019138 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.464656175 podStartE2EDuration="3.01911733s" podCreationTimestamp="2026-02-19 10:05:04 +0000 UTC" firstStartedPulling="2026-02-19 10:05:05.124713554 +0000 UTC m=+1214.414145202" lastFinishedPulling="2026-02-19 10:05:06.679174719 +0000 UTC m=+1215.968606357" observedRunningTime="2026-02-19 10:05:07.018066084 +0000 UTC m=+1216.307497722" watchObservedRunningTime="2026-02-19 10:05:07.01911733 +0000 UTC m=+1216.308548978" Feb 19 10:05:07 crc kubenswrapper[4873]: I0219 10:05:07.336147 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.790378 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.934265 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-config-data\") pod \"aaaed141-d989-4b6f-ad3b-aefe0952c823\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.934342 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaaed141-d989-4b6f-ad3b-aefe0952c823-log-httpd\") pod \"aaaed141-d989-4b6f-ad3b-aefe0952c823\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.934357 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaaed141-d989-4b6f-ad3b-aefe0952c823-run-httpd\") pod \"aaaed141-d989-4b6f-ad3b-aefe0952c823\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.934389 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8fsk\" (UniqueName: \"kubernetes.io/projected/aaaed141-d989-4b6f-ad3b-aefe0952c823-kube-api-access-t8fsk\") pod \"aaaed141-d989-4b6f-ad3b-aefe0952c823\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.934451 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-scripts\") pod \"aaaed141-d989-4b6f-ad3b-aefe0952c823\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.934489 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-sg-core-conf-yaml\") pod \"aaaed141-d989-4b6f-ad3b-aefe0952c823\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.934577 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-combined-ca-bundle\") pod \"aaaed141-d989-4b6f-ad3b-aefe0952c823\" (UID: \"aaaed141-d989-4b6f-ad3b-aefe0952c823\") " Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.934683 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaaed141-d989-4b6f-ad3b-aefe0952c823-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aaaed141-d989-4b6f-ad3b-aefe0952c823" (UID: "aaaed141-d989-4b6f-ad3b-aefe0952c823"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.934843 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaaed141-d989-4b6f-ad3b-aefe0952c823-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aaaed141-d989-4b6f-ad3b-aefe0952c823" (UID: "aaaed141-d989-4b6f-ad3b-aefe0952c823"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.935483 4873 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaaed141-d989-4b6f-ad3b-aefe0952c823-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.935515 4873 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aaaed141-d989-4b6f-ad3b-aefe0952c823-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.947748 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-scripts" (OuterVolumeSpecName: "scripts") pod "aaaed141-d989-4b6f-ad3b-aefe0952c823" (UID: "aaaed141-d989-4b6f-ad3b-aefe0952c823"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.947770 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaaed141-d989-4b6f-ad3b-aefe0952c823-kube-api-access-t8fsk" (OuterVolumeSpecName: "kube-api-access-t8fsk") pod "aaaed141-d989-4b6f-ad3b-aefe0952c823" (UID: "aaaed141-d989-4b6f-ad3b-aefe0952c823"). InnerVolumeSpecName "kube-api-access-t8fsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:05:08 crc kubenswrapper[4873]: I0219 10:05:08.961519 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aaaed141-d989-4b6f-ad3b-aefe0952c823" (UID: "aaaed141-d989-4b6f-ad3b-aefe0952c823"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.019018 4873 generic.go:334] "Generic (PLEG): container finished" podID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerID="6a6cd3259862f8346cd815b704d4c845d978984e7e26d6d40916439da8e278e3" exitCode=0 Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.019058 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaaed141-d989-4b6f-ad3b-aefe0952c823","Type":"ContainerDied","Data":"6a6cd3259862f8346cd815b704d4c845d978984e7e26d6d40916439da8e278e3"} Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.019082 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aaaed141-d989-4b6f-ad3b-aefe0952c823","Type":"ContainerDied","Data":"a10f8c50de08f100cd9a2d7823fcccf70bf26fcbe7aced261a97c5fa92b15f9b"} Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.019112 4873 scope.go:117] "RemoveContainer" containerID="08fd273370ab3603598065600f3bad166543e52db0dec0c4a5d168de0ad4347a" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.019233 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.023567 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aaaed141-d989-4b6f-ad3b-aefe0952c823" (UID: "aaaed141-d989-4b6f-ad3b-aefe0952c823"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.039648 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8fsk\" (UniqueName: \"kubernetes.io/projected/aaaed141-d989-4b6f-ad3b-aefe0952c823-kube-api-access-t8fsk\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.039677 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.039686 4873 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.040778 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.042036 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-config-data" (OuterVolumeSpecName: "config-data") pod "aaaed141-d989-4b6f-ad3b-aefe0952c823" (UID: "aaaed141-d989-4b6f-ad3b-aefe0952c823"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.090708 4873 scope.go:117] "RemoveContainer" containerID="090a2539fe4dfb43105b70b0c0088e33de18f21996d9aa508978b202715db0e2" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.126865 4873 scope.go:117] "RemoveContainer" containerID="6a6cd3259862f8346cd815b704d4c845d978984e7e26d6d40916439da8e278e3" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.143169 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaaed141-d989-4b6f-ad3b-aefe0952c823-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.154853 4873 scope.go:117] "RemoveContainer" containerID="927d24780f007579d39d1cb2db522f7319926e96deda19a1ce5ec381489e91e9" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.175383 4873 scope.go:117] "RemoveContainer" containerID="08fd273370ab3603598065600f3bad166543e52db0dec0c4a5d168de0ad4347a" Feb 19 10:05:09 crc kubenswrapper[4873]: E0219 10:05:09.175793 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08fd273370ab3603598065600f3bad166543e52db0dec0c4a5d168de0ad4347a\": container with ID starting with 08fd273370ab3603598065600f3bad166543e52db0dec0c4a5d168de0ad4347a not found: ID does not exist" containerID="08fd273370ab3603598065600f3bad166543e52db0dec0c4a5d168de0ad4347a" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.175834 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08fd273370ab3603598065600f3bad166543e52db0dec0c4a5d168de0ad4347a"} err="failed to get container status \"08fd273370ab3603598065600f3bad166543e52db0dec0c4a5d168de0ad4347a\": rpc error: code = NotFound desc = could not find container \"08fd273370ab3603598065600f3bad166543e52db0dec0c4a5d168de0ad4347a\": container with ID starting with 
08fd273370ab3603598065600f3bad166543e52db0dec0c4a5d168de0ad4347a not found: ID does not exist" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.175856 4873 scope.go:117] "RemoveContainer" containerID="090a2539fe4dfb43105b70b0c0088e33de18f21996d9aa508978b202715db0e2" Feb 19 10:05:09 crc kubenswrapper[4873]: E0219 10:05:09.176084 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"090a2539fe4dfb43105b70b0c0088e33de18f21996d9aa508978b202715db0e2\": container with ID starting with 090a2539fe4dfb43105b70b0c0088e33de18f21996d9aa508978b202715db0e2 not found: ID does not exist" containerID="090a2539fe4dfb43105b70b0c0088e33de18f21996d9aa508978b202715db0e2" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.176135 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"090a2539fe4dfb43105b70b0c0088e33de18f21996d9aa508978b202715db0e2"} err="failed to get container status \"090a2539fe4dfb43105b70b0c0088e33de18f21996d9aa508978b202715db0e2\": rpc error: code = NotFound desc = could not find container \"090a2539fe4dfb43105b70b0c0088e33de18f21996d9aa508978b202715db0e2\": container with ID starting with 090a2539fe4dfb43105b70b0c0088e33de18f21996d9aa508978b202715db0e2 not found: ID does not exist" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.176155 4873 scope.go:117] "RemoveContainer" containerID="6a6cd3259862f8346cd815b704d4c845d978984e7e26d6d40916439da8e278e3" Feb 19 10:05:09 crc kubenswrapper[4873]: E0219 10:05:09.176431 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a6cd3259862f8346cd815b704d4c845d978984e7e26d6d40916439da8e278e3\": container with ID starting with 6a6cd3259862f8346cd815b704d4c845d978984e7e26d6d40916439da8e278e3 not found: ID does not exist" containerID="6a6cd3259862f8346cd815b704d4c845d978984e7e26d6d40916439da8e278e3" Feb 19 10:05:09 crc 
kubenswrapper[4873]: I0219 10:05:09.176459 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a6cd3259862f8346cd815b704d4c845d978984e7e26d6d40916439da8e278e3"} err="failed to get container status \"6a6cd3259862f8346cd815b704d4c845d978984e7e26d6d40916439da8e278e3\": rpc error: code = NotFound desc = could not find container \"6a6cd3259862f8346cd815b704d4c845d978984e7e26d6d40916439da8e278e3\": container with ID starting with 6a6cd3259862f8346cd815b704d4c845d978984e7e26d6d40916439da8e278e3 not found: ID does not exist" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.176476 4873 scope.go:117] "RemoveContainer" containerID="927d24780f007579d39d1cb2db522f7319926e96deda19a1ce5ec381489e91e9" Feb 19 10:05:09 crc kubenswrapper[4873]: E0219 10:05:09.176759 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"927d24780f007579d39d1cb2db522f7319926e96deda19a1ce5ec381489e91e9\": container with ID starting with 927d24780f007579d39d1cb2db522f7319926e96deda19a1ce5ec381489e91e9 not found: ID does not exist" containerID="927d24780f007579d39d1cb2db522f7319926e96deda19a1ce5ec381489e91e9" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.176785 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"927d24780f007579d39d1cb2db522f7319926e96deda19a1ce5ec381489e91e9"} err="failed to get container status \"927d24780f007579d39d1cb2db522f7319926e96deda19a1ce5ec381489e91e9\": rpc error: code = NotFound desc = could not find container \"927d24780f007579d39d1cb2db522f7319926e96deda19a1ce5ec381489e91e9\": container with ID starting with 927d24780f007579d39d1cb2db522f7319926e96deda19a1ce5ec381489e91e9 not found: ID does not exist" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.359726 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:05:09 crc kubenswrapper[4873]: 
I0219 10:05:09.372367 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.425114 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:05:09 crc kubenswrapper[4873]: E0219 10:05:09.425497 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="ceilometer-central-agent" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.425515 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="ceilometer-central-agent" Feb 19 10:05:09 crc kubenswrapper[4873]: E0219 10:05:09.425540 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="sg-core" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.425547 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="sg-core" Feb 19 10:05:09 crc kubenswrapper[4873]: E0219 10:05:09.425567 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="ceilometer-notification-agent" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.425573 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="ceilometer-notification-agent" Feb 19 10:05:09 crc kubenswrapper[4873]: E0219 10:05:09.425589 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="proxy-httpd" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.425595 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="proxy-httpd" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.425818 4873 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="sg-core" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.425835 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="proxy-httpd" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.425846 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="ceilometer-central-agent" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.425857 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" containerName="ceilometer-notification-agent" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.427615 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.431762 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.431966 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.432076 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.442371 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.494034 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaaed141-d989-4b6f-ad3b-aefe0952c823" path="/var/lib/kubelet/pods/aaaed141-d989-4b6f-ad3b-aefe0952c823/volumes" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.559143 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9c767ece-c345-4a24-93b3-3e7e3f662e0f-log-httpd\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.559219 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-scripts\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.559258 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bj7n\" (UniqueName: \"kubernetes.io/projected/9c767ece-c345-4a24-93b3-3e7e3f662e0f-kube-api-access-6bj7n\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.559375 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.559427 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.559455 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c767ece-c345-4a24-93b3-3e7e3f662e0f-run-httpd\") pod \"ceilometer-0\" (UID: 
\"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.559474 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.559499 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-config-data\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.661458 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.661522 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.661559 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c767ece-c345-4a24-93b3-3e7e3f662e0f-run-httpd\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.661585 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.661611 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-config-data\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.661691 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c767ece-c345-4a24-93b3-3e7e3f662e0f-log-httpd\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.661729 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-scripts\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.661751 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bj7n\" (UniqueName: \"kubernetes.io/projected/9c767ece-c345-4a24-93b3-3e7e3f662e0f-kube-api-access-6bj7n\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.662536 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c767ece-c345-4a24-93b3-3e7e3f662e0f-log-httpd\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " 
pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.662709 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c767ece-c345-4a24-93b3-3e7e3f662e0f-run-httpd\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.665279 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.665279 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.666127 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-config-data\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.666854 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-scripts\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.676698 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.679290 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bj7n\" (UniqueName: \"kubernetes.io/projected/9c767ece-c345-4a24-93b3-3e7e3f662e0f-kube-api-access-6bj7n\") pod \"ceilometer-0\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " pod="openstack/ceilometer-0" Feb 19 10:05:09 crc kubenswrapper[4873]: I0219 10:05:09.787527 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:05:10 crc kubenswrapper[4873]: I0219 10:05:10.247544 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:05:11 crc kubenswrapper[4873]: I0219 10:05:11.041042 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c767ece-c345-4a24-93b3-3e7e3f662e0f","Type":"ContainerStarted","Data":"e233046154b94937075780e18db17e39bb296d5ab4b60baa69e90a853f3a5ed2"} Feb 19 10:05:11 crc kubenswrapper[4873]: I0219 10:05:11.041383 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c767ece-c345-4a24-93b3-3e7e3f662e0f","Type":"ContainerStarted","Data":"8a41ac2def08f14d8da4168d8f19aa2fabc8acb436044297b419855d749d3f0b"} Feb 19 10:05:11 crc kubenswrapper[4873]: I0219 10:05:11.041395 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c767ece-c345-4a24-93b3-3e7e3f662e0f","Type":"ContainerStarted","Data":"807e7166281c5f8f2d5afe5dddfd4f72b55225d08b8ebed491be486ce864d054"} Feb 19 10:05:12 crc kubenswrapper[4873]: I0219 10:05:12.053362 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9c767ece-c345-4a24-93b3-3e7e3f662e0f","Type":"ContainerStarted","Data":"d7bfa040f5b56903732b37fc818d6022980d779f6d081559594788050a432fc4"} Feb 19 10:05:12 crc kubenswrapper[4873]: I0219 10:05:12.433740 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 10:05:12 crc kubenswrapper[4873]: I0219 10:05:12.435452 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 10:05:12 crc kubenswrapper[4873]: I0219 10:05:12.443505 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 10:05:12 crc kubenswrapper[4873]: I0219 10:05:12.933866 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.032697 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/592b92b0-44a1-4386-8f2e-8a55633dedd8-combined-ca-bundle\") pod \"592b92b0-44a1-4386-8f2e-8a55633dedd8\" (UID: \"592b92b0-44a1-4386-8f2e-8a55633dedd8\") " Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.033084 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcmjb\" (UniqueName: \"kubernetes.io/projected/592b92b0-44a1-4386-8f2e-8a55633dedd8-kube-api-access-fcmjb\") pod \"592b92b0-44a1-4386-8f2e-8a55633dedd8\" (UID: \"592b92b0-44a1-4386-8f2e-8a55633dedd8\") " Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.033225 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592b92b0-44a1-4386-8f2e-8a55633dedd8-config-data\") pod \"592b92b0-44a1-4386-8f2e-8a55633dedd8\" (UID: \"592b92b0-44a1-4386-8f2e-8a55633dedd8\") " Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.038416 4873 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/592b92b0-44a1-4386-8f2e-8a55633dedd8-kube-api-access-fcmjb" (OuterVolumeSpecName: "kube-api-access-fcmjb") pod "592b92b0-44a1-4386-8f2e-8a55633dedd8" (UID: "592b92b0-44a1-4386-8f2e-8a55633dedd8"). InnerVolumeSpecName "kube-api-access-fcmjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.067304 4873 generic.go:334] "Generic (PLEG): container finished" podID="592b92b0-44a1-4386-8f2e-8a55633dedd8" containerID="d989940ed8c94822b97ee63d283f9c02be1cea0db1e5579461b45d4f18b376b3" exitCode=137 Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.067493 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"592b92b0-44a1-4386-8f2e-8a55633dedd8","Type":"ContainerDied","Data":"d989940ed8c94822b97ee63d283f9c02be1cea0db1e5579461b45d4f18b376b3"} Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.067566 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"592b92b0-44a1-4386-8f2e-8a55633dedd8","Type":"ContainerDied","Data":"dd5cbf40dbe04c7e636d3d6dd6dd491656051f5351076f90a0afd9cb6afdd2b3"} Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.067592 4873 scope.go:117] "RemoveContainer" containerID="d989940ed8c94822b97ee63d283f9c02be1cea0db1e5579461b45d4f18b376b3" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.069152 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.070750 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/592b92b0-44a1-4386-8f2e-8a55633dedd8-config-data" (OuterVolumeSpecName: "config-data") pod "592b92b0-44a1-4386-8f2e-8a55633dedd8" (UID: "592b92b0-44a1-4386-8f2e-8a55633dedd8"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.083571 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.093604 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/592b92b0-44a1-4386-8f2e-8a55633dedd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "592b92b0-44a1-4386-8f2e-8a55633dedd8" (UID: "592b92b0-44a1-4386-8f2e-8a55633dedd8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.137639 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/592b92b0-44a1-4386-8f2e-8a55633dedd8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.137674 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcmjb\" (UniqueName: \"kubernetes.io/projected/592b92b0-44a1-4386-8f2e-8a55633dedd8-kube-api-access-fcmjb\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.137686 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592b92b0-44a1-4386-8f2e-8a55633dedd8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.174028 4873 scope.go:117] "RemoveContainer" containerID="d989940ed8c94822b97ee63d283f9c02be1cea0db1e5579461b45d4f18b376b3" Feb 19 10:05:13 crc kubenswrapper[4873]: E0219 10:05:13.174923 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d989940ed8c94822b97ee63d283f9c02be1cea0db1e5579461b45d4f18b376b3\": container with ID starting with 
d989940ed8c94822b97ee63d283f9c02be1cea0db1e5579461b45d4f18b376b3 not found: ID does not exist" containerID="d989940ed8c94822b97ee63d283f9c02be1cea0db1e5579461b45d4f18b376b3" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.174969 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d989940ed8c94822b97ee63d283f9c02be1cea0db1e5579461b45d4f18b376b3"} err="failed to get container status \"d989940ed8c94822b97ee63d283f9c02be1cea0db1e5579461b45d4f18b376b3\": rpc error: code = NotFound desc = could not find container \"d989940ed8c94822b97ee63d283f9c02be1cea0db1e5579461b45d4f18b376b3\": container with ID starting with d989940ed8c94822b97ee63d283f9c02be1cea0db1e5579461b45d4f18b376b3 not found: ID does not exist" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.273812 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.275069 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.278562 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.281032 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.416996 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.439449 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.452117 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 10:05:13 crc kubenswrapper[4873]: E0219 10:05:13.452556 4873 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="592b92b0-44a1-4386-8f2e-8a55633dedd8" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.452572 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="592b92b0-44a1-4386-8f2e-8a55633dedd8" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.452747 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="592b92b0-44a1-4386-8f2e-8a55633dedd8" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.453393 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.456495 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.456699 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.456755 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.460487 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.496064 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="592b92b0-44a1-4386-8f2e-8a55633dedd8" path="/var/lib/kubelet/pods/592b92b0-44a1-4386-8f2e-8a55633dedd8/volumes" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.545135 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf46452a-f49d-48ab-a235-9e96f89c931f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"cf46452a-f49d-48ab-a235-9e96f89c931f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.545300 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv8mw\" (UniqueName: \"kubernetes.io/projected/cf46452a-f49d-48ab-a235-9e96f89c931f-kube-api-access-nv8mw\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf46452a-f49d-48ab-a235-9e96f89c931f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.545384 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf46452a-f49d-48ab-a235-9e96f89c931f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf46452a-f49d-48ab-a235-9e96f89c931f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.545662 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf46452a-f49d-48ab-a235-9e96f89c931f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf46452a-f49d-48ab-a235-9e96f89c931f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.545749 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf46452a-f49d-48ab-a235-9e96f89c931f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf46452a-f49d-48ab-a235-9e96f89c931f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.648250 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf46452a-f49d-48ab-a235-9e96f89c931f-nova-novncproxy-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"cf46452a-f49d-48ab-a235-9e96f89c931f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.648312 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf46452a-f49d-48ab-a235-9e96f89c931f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf46452a-f49d-48ab-a235-9e96f89c931f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.648364 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf46452a-f49d-48ab-a235-9e96f89c931f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf46452a-f49d-48ab-a235-9e96f89c931f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.648440 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv8mw\" (UniqueName: \"kubernetes.io/projected/cf46452a-f49d-48ab-a235-9e96f89c931f-kube-api-access-nv8mw\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf46452a-f49d-48ab-a235-9e96f89c931f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.648518 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf46452a-f49d-48ab-a235-9e96f89c931f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf46452a-f49d-48ab-a235-9e96f89c931f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.652991 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf46452a-f49d-48ab-a235-9e96f89c931f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf46452a-f49d-48ab-a235-9e96f89c931f\") " 
pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.653647 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf46452a-f49d-48ab-a235-9e96f89c931f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf46452a-f49d-48ab-a235-9e96f89c931f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.653708 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf46452a-f49d-48ab-a235-9e96f89c931f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf46452a-f49d-48ab-a235-9e96f89c931f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.655671 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf46452a-f49d-48ab-a235-9e96f89c931f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf46452a-f49d-48ab-a235-9e96f89c931f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.666180 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv8mw\" (UniqueName: \"kubernetes.io/projected/cf46452a-f49d-48ab-a235-9e96f89c931f-kube-api-access-nv8mw\") pod \"nova-cell1-novncproxy-0\" (UID: \"cf46452a-f49d-48ab-a235-9e96f89c931f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:13 crc kubenswrapper[4873]: I0219 10:05:13.790447 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.085337 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c767ece-c345-4a24-93b3-3e7e3f662e0f","Type":"ContainerStarted","Data":"281639fdf7e8eb6f014e8f87ef246ca8732e03adc6b6d58aee5acdcf818ca43a"} Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.086172 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.087629 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.095925 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.143346 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.093480112 podStartE2EDuration="5.143325072s" podCreationTimestamp="2026-02-19 10:05:09 +0000 UTC" firstStartedPulling="2026-02-19 10:05:10.2552757 +0000 UTC m=+1219.544707338" lastFinishedPulling="2026-02-19 10:05:13.30512066 +0000 UTC m=+1222.594552298" observedRunningTime="2026-02-19 10:05:14.115015714 +0000 UTC m=+1223.404447352" watchObservedRunningTime="2026-02-19 10:05:14.143325072 +0000 UTC m=+1223.432756710" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.294240 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78d65dbfc-jjvbb"] Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.296790 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.333871 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78d65dbfc-jjvbb"] Feb 19 10:05:14 crc kubenswrapper[4873]: W0219 10:05:14.335702 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf46452a_f49d_48ab_a235_9e96f89c931f.slice/crio-9418b54ccc328fe85e923e683a72e777da925d9c459b57bd508915bb11e8a16c WatchSource:0}: Error finding container 9418b54ccc328fe85e923e683a72e777da925d9c459b57bd508915bb11e8a16c: Status 404 returned error can't find the container with id 9418b54ccc328fe85e923e683a72e777da925d9c459b57bd508915bb11e8a16c Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.364210 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.469589 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-dns-svc\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.469695 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-ovsdbserver-sb\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.469768 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-config\") pod 
\"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.469833 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx9vs\" (UniqueName: \"kubernetes.io/projected/7fe74544-e8af-45bd-9193-2b247c5e002b-kube-api-access-zx9vs\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.469866 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-dns-swift-storage-0\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.469892 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-ovsdbserver-nb\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.571752 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-config\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.571852 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx9vs\" (UniqueName: 
\"kubernetes.io/projected/7fe74544-e8af-45bd-9193-2b247c5e002b-kube-api-access-zx9vs\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.571881 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-dns-swift-storage-0\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.571914 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-ovsdbserver-nb\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.572001 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-dns-svc\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.572155 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-ovsdbserver-sb\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.573129 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-ovsdbserver-nb\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.573237 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-config\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.573897 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-ovsdbserver-sb\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.574270 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-dns-swift-storage-0\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.574406 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-dns-svc\") pod \"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.594007 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx9vs\" (UniqueName: \"kubernetes.io/projected/7fe74544-e8af-45bd-9193-2b247c5e002b-kube-api-access-zx9vs\") pod 
\"dnsmasq-dns-78d65dbfc-jjvbb\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.623011 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:14 crc kubenswrapper[4873]: I0219 10:05:14.682546 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 10:05:15 crc kubenswrapper[4873]: I0219 10:05:15.095320 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cf46452a-f49d-48ab-a235-9e96f89c931f","Type":"ContainerStarted","Data":"38a104eb659238745f063c24e19ef9f6376fe94feac7415bb8767d07f2a7a77b"} Feb 19 10:05:15 crc kubenswrapper[4873]: I0219 10:05:15.095826 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cf46452a-f49d-48ab-a235-9e96f89c931f","Type":"ContainerStarted","Data":"9418b54ccc328fe85e923e683a72e777da925d9c459b57bd508915bb11e8a16c"} Feb 19 10:05:15 crc kubenswrapper[4873]: I0219 10:05:15.116368 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.1163519060000002 podStartE2EDuration="2.116351906s" podCreationTimestamp="2026-02-19 10:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:05:15.113937906 +0000 UTC m=+1224.403369544" watchObservedRunningTime="2026-02-19 10:05:15.116351906 +0000 UTC m=+1224.405783534" Feb 19 10:05:15 crc kubenswrapper[4873]: I0219 10:05:15.144148 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78d65dbfc-jjvbb"] Feb 19 10:05:15 crc kubenswrapper[4873]: W0219 10:05:15.148151 4873 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fe74544_e8af_45bd_9193_2b247c5e002b.slice/crio-bc3e27b7897877cb015837d9764a384c7709da0f91ae1099d9eccef91911ed99 WatchSource:0}: Error finding container bc3e27b7897877cb015837d9764a384c7709da0f91ae1099d9eccef91911ed99: Status 404 returned error can't find the container with id bc3e27b7897877cb015837d9764a384c7709da0f91ae1099d9eccef91911ed99 Feb 19 10:05:16 crc kubenswrapper[4873]: I0219 10:05:16.104493 4873 generic.go:334] "Generic (PLEG): container finished" podID="7fe74544-e8af-45bd-9193-2b247c5e002b" containerID="3584138ebb0ff2c764fd4906203bf8dafbda732c59288ffd05b66db47c5cbd6d" exitCode=0 Feb 19 10:05:16 crc kubenswrapper[4873]: I0219 10:05:16.106384 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" event={"ID":"7fe74544-e8af-45bd-9193-2b247c5e002b","Type":"ContainerDied","Data":"3584138ebb0ff2c764fd4906203bf8dafbda732c59288ffd05b66db47c5cbd6d"} Feb 19 10:05:16 crc kubenswrapper[4873]: I0219 10:05:16.106417 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" event={"ID":"7fe74544-e8af-45bd-9193-2b247c5e002b","Type":"ContainerStarted","Data":"bc3e27b7897877cb015837d9764a384c7709da0f91ae1099d9eccef91911ed99"} Feb 19 10:05:16 crc kubenswrapper[4873]: I0219 10:05:16.529697 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:05:17 crc kubenswrapper[4873]: I0219 10:05:17.116589 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" event={"ID":"7fe74544-e8af-45bd-9193-2b247c5e002b","Type":"ContainerStarted","Data":"2ea5c5513c19c4058c8a4129fb3b5a3547db474091e874784f52fc36d3b3d60e"} Feb 19 10:05:17 crc kubenswrapper[4873]: I0219 10:05:17.116736 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d2e87055-b0d9-4e47-9e2d-db14987e29c1" containerName="nova-api-log" 
containerID="cri-o://f6c6a093d08b34db5340df77d6fafcc6cae16deeb5b24238c9aa4d6a0671d3c2" gracePeriod=30 Feb 19 10:05:17 crc kubenswrapper[4873]: I0219 10:05:17.116876 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d2e87055-b0d9-4e47-9e2d-db14987e29c1" containerName="nova-api-api" containerID="cri-o://2ca3a3ad7f9ca37295ebb7a316ce7b89e3162d36b731b68e154c3c09cba2fb3e" gracePeriod=30 Feb 19 10:05:17 crc kubenswrapper[4873]: I0219 10:05:17.196690 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" podStartSLOduration=3.19667071 podStartE2EDuration="3.19667071s" podCreationTimestamp="2026-02-19 10:05:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:05:17.174514776 +0000 UTC m=+1226.463946414" watchObservedRunningTime="2026-02-19 10:05:17.19667071 +0000 UTC m=+1226.486102348" Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.133488 4873 generic.go:334] "Generic (PLEG): container finished" podID="d2e87055-b0d9-4e47-9e2d-db14987e29c1" containerID="f6c6a093d08b34db5340df77d6fafcc6cae16deeb5b24238c9aa4d6a0671d3c2" exitCode=143 Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.134391 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2e87055-b0d9-4e47-9e2d-db14987e29c1","Type":"ContainerDied","Data":"f6c6a093d08b34db5340df77d6fafcc6cae16deeb5b24238c9aa4d6a0671d3c2"} Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.134419 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.564507 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.564980 4873 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="ceilometer-central-agent" containerID="cri-o://8a41ac2def08f14d8da4168d8f19aa2fabc8acb436044297b419855d749d3f0b" gracePeriod=30 Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.565028 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="sg-core" containerID="cri-o://d7bfa040f5b56903732b37fc818d6022980d779f6d081559594788050a432fc4" gracePeriod=30 Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.565081 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="ceilometer-notification-agent" containerID="cri-o://e233046154b94937075780e18db17e39bb296d5ab4b60baa69e90a853f3a5ed2" gracePeriod=30 Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.565091 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="proxy-httpd" containerID="cri-o://281639fdf7e8eb6f014e8f87ef246ca8732e03adc6b6d58aee5acdcf818ca43a" gracePeriod=30 Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.792718 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.853428 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.987680 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2e87055-b0d9-4e47-9e2d-db14987e29c1-logs\") pod \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.987731 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbwjp\" (UniqueName: \"kubernetes.io/projected/d2e87055-b0d9-4e47-9e2d-db14987e29c1-kube-api-access-jbwjp\") pod \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.987835 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2e87055-b0d9-4e47-9e2d-db14987e29c1-config-data\") pod \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.987922 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e87055-b0d9-4e47-9e2d-db14987e29c1-combined-ca-bundle\") pod \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\" (UID: \"d2e87055-b0d9-4e47-9e2d-db14987e29c1\") " Feb 19 10:05:18 crc kubenswrapper[4873]: I0219 10:05:18.994992 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2e87055-b0d9-4e47-9e2d-db14987e29c1-logs" (OuterVolumeSpecName: "logs") pod "d2e87055-b0d9-4e47-9e2d-db14987e29c1" (UID: "d2e87055-b0d9-4e47-9e2d-db14987e29c1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:18.999340 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2e87055-b0d9-4e47-9e2d-db14987e29c1-kube-api-access-jbwjp" (OuterVolumeSpecName: "kube-api-access-jbwjp") pod "d2e87055-b0d9-4e47-9e2d-db14987e29c1" (UID: "d2e87055-b0d9-4e47-9e2d-db14987e29c1"). InnerVolumeSpecName "kube-api-access-jbwjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.035526 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2e87055-b0d9-4e47-9e2d-db14987e29c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2e87055-b0d9-4e47-9e2d-db14987e29c1" (UID: "d2e87055-b0d9-4e47-9e2d-db14987e29c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.045655 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2e87055-b0d9-4e47-9e2d-db14987e29c1-config-data" (OuterVolumeSpecName: "config-data") pod "d2e87055-b0d9-4e47-9e2d-db14987e29c1" (UID: "d2e87055-b0d9-4e47-9e2d-db14987e29c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.090023 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2e87055-b0d9-4e47-9e2d-db14987e29c1-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.090061 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e87055-b0d9-4e47-9e2d-db14987e29c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.090071 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2e87055-b0d9-4e47-9e2d-db14987e29c1-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.090079 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbwjp\" (UniqueName: \"kubernetes.io/projected/d2e87055-b0d9-4e47-9e2d-db14987e29c1-kube-api-access-jbwjp\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.145816 4873 generic.go:334] "Generic (PLEG): container finished" podID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerID="281639fdf7e8eb6f014e8f87ef246ca8732e03adc6b6d58aee5acdcf818ca43a" exitCode=0 Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.145865 4873 generic.go:334] "Generic (PLEG): container finished" podID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerID="d7bfa040f5b56903732b37fc818d6022980d779f6d081559594788050a432fc4" exitCode=2 Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.145890 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c767ece-c345-4a24-93b3-3e7e3f662e0f","Type":"ContainerDied","Data":"281639fdf7e8eb6f014e8f87ef246ca8732e03adc6b6d58aee5acdcf818ca43a"} Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.145938 4873 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c767ece-c345-4a24-93b3-3e7e3f662e0f","Type":"ContainerDied","Data":"d7bfa040f5b56903732b37fc818d6022980d779f6d081559594788050a432fc4"} Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.148468 4873 generic.go:334] "Generic (PLEG): container finished" podID="d2e87055-b0d9-4e47-9e2d-db14987e29c1" containerID="2ca3a3ad7f9ca37295ebb7a316ce7b89e3162d36b731b68e154c3c09cba2fb3e" exitCode=0 Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.148551 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.148559 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2e87055-b0d9-4e47-9e2d-db14987e29c1","Type":"ContainerDied","Data":"2ca3a3ad7f9ca37295ebb7a316ce7b89e3162d36b731b68e154c3c09cba2fb3e"} Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.148594 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2e87055-b0d9-4e47-9e2d-db14987e29c1","Type":"ContainerDied","Data":"d550b23e94f3932a98e323b206b2513b13546a69a9e2a91f44f15350f1bda5ab"} Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.148614 4873 scope.go:117] "RemoveContainer" containerID="2ca3a3ad7f9ca37295ebb7a316ce7b89e3162d36b731b68e154c3c09cba2fb3e" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.174130 4873 scope.go:117] "RemoveContainer" containerID="f6c6a093d08b34db5340df77d6fafcc6cae16deeb5b24238c9aa4d6a0671d3c2" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.193945 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.199068 4873 scope.go:117] "RemoveContainer" containerID="2ca3a3ad7f9ca37295ebb7a316ce7b89e3162d36b731b68e154c3c09cba2fb3e" Feb 19 10:05:19 crc kubenswrapper[4873]: E0219 10:05:19.199558 4873 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ca3a3ad7f9ca37295ebb7a316ce7b89e3162d36b731b68e154c3c09cba2fb3e\": container with ID starting with 2ca3a3ad7f9ca37295ebb7a316ce7b89e3162d36b731b68e154c3c09cba2fb3e not found: ID does not exist" containerID="2ca3a3ad7f9ca37295ebb7a316ce7b89e3162d36b731b68e154c3c09cba2fb3e" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.199598 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca3a3ad7f9ca37295ebb7a316ce7b89e3162d36b731b68e154c3c09cba2fb3e"} err="failed to get container status \"2ca3a3ad7f9ca37295ebb7a316ce7b89e3162d36b731b68e154c3c09cba2fb3e\": rpc error: code = NotFound desc = could not find container \"2ca3a3ad7f9ca37295ebb7a316ce7b89e3162d36b731b68e154c3c09cba2fb3e\": container with ID starting with 2ca3a3ad7f9ca37295ebb7a316ce7b89e3162d36b731b68e154c3c09cba2fb3e not found: ID does not exist" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.199624 4873 scope.go:117] "RemoveContainer" containerID="f6c6a093d08b34db5340df77d6fafcc6cae16deeb5b24238c9aa4d6a0671d3c2" Feb 19 10:05:19 crc kubenswrapper[4873]: E0219 10:05:19.199917 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c6a093d08b34db5340df77d6fafcc6cae16deeb5b24238c9aa4d6a0671d3c2\": container with ID starting with f6c6a093d08b34db5340df77d6fafcc6cae16deeb5b24238c9aa4d6a0671d3c2 not found: ID does not exist" containerID="f6c6a093d08b34db5340df77d6fafcc6cae16deeb5b24238c9aa4d6a0671d3c2" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.199963 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c6a093d08b34db5340df77d6fafcc6cae16deeb5b24238c9aa4d6a0671d3c2"} err="failed to get container status \"f6c6a093d08b34db5340df77d6fafcc6cae16deeb5b24238c9aa4d6a0671d3c2\": rpc error: code = NotFound 
desc = could not find container \"f6c6a093d08b34db5340df77d6fafcc6cae16deeb5b24238c9aa4d6a0671d3c2\": container with ID starting with f6c6a093d08b34db5340df77d6fafcc6cae16deeb5b24238c9aa4d6a0671d3c2 not found: ID does not exist" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.203549 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.225420 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 10:05:19 crc kubenswrapper[4873]: E0219 10:05:19.225867 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2e87055-b0d9-4e47-9e2d-db14987e29c1" containerName="nova-api-api" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.225888 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2e87055-b0d9-4e47-9e2d-db14987e29c1" containerName="nova-api-api" Feb 19 10:05:19 crc kubenswrapper[4873]: E0219 10:05:19.225926 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2e87055-b0d9-4e47-9e2d-db14987e29c1" containerName="nova-api-log" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.225935 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2e87055-b0d9-4e47-9e2d-db14987e29c1" containerName="nova-api-log" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.226241 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2e87055-b0d9-4e47-9e2d-db14987e29c1" containerName="nova-api-log" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.226264 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2e87055-b0d9-4e47-9e2d-db14987e29c1" containerName="nova-api-api" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.227410 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.230159 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.230380 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.230653 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.234311 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.395515 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-public-tls-certs\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.395605 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.395647 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.395775 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-config-data\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.395891 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-logs\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.396050 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjk8d\" (UniqueName: \"kubernetes.io/projected/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-kube-api-access-qjk8d\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.495974 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2e87055-b0d9-4e47-9e2d-db14987e29c1" path="/var/lib/kubelet/pods/d2e87055-b0d9-4e47-9e2d-db14987e29c1/volumes" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.497630 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-logs\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.497684 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjk8d\" (UniqueName: \"kubernetes.io/projected/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-kube-api-access-qjk8d\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.497742 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-public-tls-certs\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.497789 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.497819 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.497891 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-config-data\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.498071 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-logs\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.504299 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 
19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.504296 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-config-data\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.504443 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.504525 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-public-tls-certs\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.515363 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjk8d\" (UniqueName: \"kubernetes.io/projected/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-kube-api-access-qjk8d\") pod \"nova-api-0\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " pod="openstack/nova-api-0" Feb 19 10:05:19 crc kubenswrapper[4873]: I0219 10:05:19.545062 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.022594 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:05:20 crc kubenswrapper[4873]: W0219 10:05:20.025236 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod772434a9_08d5_499e_9ea1_e9ed0cc1e1b6.slice/crio-11443c61c72c773d1b9e2f4dbdc5cf11081029e2fa850f7683c10c10c8471883 WatchSource:0}: Error finding container 11443c61c72c773d1b9e2f4dbdc5cf11081029e2fa850f7683c10c10c8471883: Status 404 returned error can't find the container with id 11443c61c72c773d1b9e2f4dbdc5cf11081029e2fa850f7683c10c10c8471883 Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.159842 4873 generic.go:334] "Generic (PLEG): container finished" podID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerID="e233046154b94937075780e18db17e39bb296d5ab4b60baa69e90a853f3a5ed2" exitCode=0 Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.159912 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c767ece-c345-4a24-93b3-3e7e3f662e0f","Type":"ContainerDied","Data":"e233046154b94937075780e18db17e39bb296d5ab4b60baa69e90a853f3a5ed2"} Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.164620 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6","Type":"ContainerStarted","Data":"11443c61c72c773d1b9e2f4dbdc5cf11081029e2fa850f7683c10c10c8471883"} Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.674905 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.851514 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-ceilometer-tls-certs\") pod \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.851559 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-combined-ca-bundle\") pod \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.851670 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c767ece-c345-4a24-93b3-3e7e3f662e0f-run-httpd\") pod \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.851765 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-sg-core-conf-yaml\") pod \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.851799 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-scripts\") pod \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.851834 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-config-data\") pod \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.851876 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bj7n\" (UniqueName: \"kubernetes.io/projected/9c767ece-c345-4a24-93b3-3e7e3f662e0f-kube-api-access-6bj7n\") pod \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.851895 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c767ece-c345-4a24-93b3-3e7e3f662e0f-log-httpd\") pod \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\" (UID: \"9c767ece-c345-4a24-93b3-3e7e3f662e0f\") " Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.852599 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c767ece-c345-4a24-93b3-3e7e3f662e0f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9c767ece-c345-4a24-93b3-3e7e3f662e0f" (UID: "9c767ece-c345-4a24-93b3-3e7e3f662e0f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.852705 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c767ece-c345-4a24-93b3-3e7e3f662e0f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9c767ece-c345-4a24-93b3-3e7e3f662e0f" (UID: "9c767ece-c345-4a24-93b3-3e7e3f662e0f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.856692 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c767ece-c345-4a24-93b3-3e7e3f662e0f-kube-api-access-6bj7n" (OuterVolumeSpecName: "kube-api-access-6bj7n") pod "9c767ece-c345-4a24-93b3-3e7e3f662e0f" (UID: "9c767ece-c345-4a24-93b3-3e7e3f662e0f"). InnerVolumeSpecName "kube-api-access-6bj7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.869837 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-scripts" (OuterVolumeSpecName: "scripts") pod "9c767ece-c345-4a24-93b3-3e7e3f662e0f" (UID: "9c767ece-c345-4a24-93b3-3e7e3f662e0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.904931 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9c767ece-c345-4a24-93b3-3e7e3f662e0f" (UID: "9c767ece-c345-4a24-93b3-3e7e3f662e0f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.920398 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9c767ece-c345-4a24-93b3-3e7e3f662e0f" (UID: "9c767ece-c345-4a24-93b3-3e7e3f662e0f"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.946376 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c767ece-c345-4a24-93b3-3e7e3f662e0f" (UID: "9c767ece-c345-4a24-93b3-3e7e3f662e0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.954448 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bj7n\" (UniqueName: \"kubernetes.io/projected/9c767ece-c345-4a24-93b3-3e7e3f662e0f-kube-api-access-6bj7n\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.954476 4873 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c767ece-c345-4a24-93b3-3e7e3f662e0f-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.954487 4873 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.954496 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.954505 4873 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c767ece-c345-4a24-93b3-3e7e3f662e0f-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.954514 4873 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.954521 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:20 crc kubenswrapper[4873]: I0219 10:05:20.988607 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-config-data" (OuterVolumeSpecName: "config-data") pod "9c767ece-c345-4a24-93b3-3e7e3f662e0f" (UID: "9c767ece-c345-4a24-93b3-3e7e3f662e0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.056631 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c767ece-c345-4a24-93b3-3e7e3f662e0f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.179325 4873 generic.go:334] "Generic (PLEG): container finished" podID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerID="8a41ac2def08f14d8da4168d8f19aa2fabc8acb436044297b419855d749d3f0b" exitCode=0 Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.179440 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c767ece-c345-4a24-93b3-3e7e3f662e0f","Type":"ContainerDied","Data":"8a41ac2def08f14d8da4168d8f19aa2fabc8acb436044297b419855d749d3f0b"} Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.179507 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c767ece-c345-4a24-93b3-3e7e3f662e0f","Type":"ContainerDied","Data":"807e7166281c5f8f2d5afe5dddfd4f72b55225d08b8ebed491be486ce864d054"} Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.179477 4873 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.179532 4873 scope.go:117] "RemoveContainer" containerID="281639fdf7e8eb6f014e8f87ef246ca8732e03adc6b6d58aee5acdcf818ca43a" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.184597 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6","Type":"ContainerStarted","Data":"ee64d4a6004bb57017d578c308d94024de77698e585848d30e9375ee1fa2c2cb"} Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.184633 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6","Type":"ContainerStarted","Data":"a591f50f852c32017b43b07ceb566ed62ac4d82f0a926ca84d57c3237e89bc74"} Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.207574 4873 scope.go:117] "RemoveContainer" containerID="d7bfa040f5b56903732b37fc818d6022980d779f6d081559594788050a432fc4" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.236665 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.236648732 podStartE2EDuration="2.236648732s" podCreationTimestamp="2026-02-19 10:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:05:21.214351984 +0000 UTC m=+1230.503783622" watchObservedRunningTime="2026-02-19 10:05:21.236648732 +0000 UTC m=+1230.526080370" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.240520 4873 scope.go:117] "RemoveContainer" containerID="e233046154b94937075780e18db17e39bb296d5ab4b60baa69e90a853f3a5ed2" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.249194 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.262517 4873 
scope.go:117] "RemoveContainer" containerID="8a41ac2def08f14d8da4168d8f19aa2fabc8acb436044297b419855d749d3f0b" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.264987 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.277562 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:05:21 crc kubenswrapper[4873]: E0219 10:05:21.278027 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="ceilometer-central-agent" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.278047 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="ceilometer-central-agent" Feb 19 10:05:21 crc kubenswrapper[4873]: E0219 10:05:21.278058 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="ceilometer-notification-agent" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.278064 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="ceilometer-notification-agent" Feb 19 10:05:21 crc kubenswrapper[4873]: E0219 10:05:21.278076 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="proxy-httpd" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.278083 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="proxy-httpd" Feb 19 10:05:21 crc kubenswrapper[4873]: E0219 10:05:21.278118 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="sg-core" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.278125 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" 
containerName="sg-core" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.278329 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="sg-core" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.278354 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="proxy-httpd" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.278371 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="ceilometer-notification-agent" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.278384 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" containerName="ceilometer-central-agent" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.280438 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.283511 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.284017 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.284263 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.285588 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.298584 4873 scope.go:117] "RemoveContainer" containerID="281639fdf7e8eb6f014e8f87ef246ca8732e03adc6b6d58aee5acdcf818ca43a" Feb 19 10:05:21 crc kubenswrapper[4873]: E0219 10:05:21.300828 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"281639fdf7e8eb6f014e8f87ef246ca8732e03adc6b6d58aee5acdcf818ca43a\": container with ID starting with 281639fdf7e8eb6f014e8f87ef246ca8732e03adc6b6d58aee5acdcf818ca43a not found: ID does not exist" containerID="281639fdf7e8eb6f014e8f87ef246ca8732e03adc6b6d58aee5acdcf818ca43a" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.300954 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281639fdf7e8eb6f014e8f87ef246ca8732e03adc6b6d58aee5acdcf818ca43a"} err="failed to get container status \"281639fdf7e8eb6f014e8f87ef246ca8732e03adc6b6d58aee5acdcf818ca43a\": rpc error: code = NotFound desc = could not find container \"281639fdf7e8eb6f014e8f87ef246ca8732e03adc6b6d58aee5acdcf818ca43a\": container with ID starting with 281639fdf7e8eb6f014e8f87ef246ca8732e03adc6b6d58aee5acdcf818ca43a not found: ID does not exist" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.301082 4873 scope.go:117] "RemoveContainer" containerID="d7bfa040f5b56903732b37fc818d6022980d779f6d081559594788050a432fc4" Feb 19 10:05:21 crc kubenswrapper[4873]: E0219 10:05:21.301571 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7bfa040f5b56903732b37fc818d6022980d779f6d081559594788050a432fc4\": container with ID starting with d7bfa040f5b56903732b37fc818d6022980d779f6d081559594788050a432fc4 not found: ID does not exist" containerID="d7bfa040f5b56903732b37fc818d6022980d779f6d081559594788050a432fc4" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.301692 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7bfa040f5b56903732b37fc818d6022980d779f6d081559594788050a432fc4"} err="failed to get container status \"d7bfa040f5b56903732b37fc818d6022980d779f6d081559594788050a432fc4\": rpc error: code = NotFound desc = could not find container 
\"d7bfa040f5b56903732b37fc818d6022980d779f6d081559594788050a432fc4\": container with ID starting with d7bfa040f5b56903732b37fc818d6022980d779f6d081559594788050a432fc4 not found: ID does not exist" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.301801 4873 scope.go:117] "RemoveContainer" containerID="e233046154b94937075780e18db17e39bb296d5ab4b60baa69e90a853f3a5ed2" Feb 19 10:05:21 crc kubenswrapper[4873]: E0219 10:05:21.302958 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e233046154b94937075780e18db17e39bb296d5ab4b60baa69e90a853f3a5ed2\": container with ID starting with e233046154b94937075780e18db17e39bb296d5ab4b60baa69e90a853f3a5ed2 not found: ID does not exist" containerID="e233046154b94937075780e18db17e39bb296d5ab4b60baa69e90a853f3a5ed2" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.302993 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e233046154b94937075780e18db17e39bb296d5ab4b60baa69e90a853f3a5ed2"} err="failed to get container status \"e233046154b94937075780e18db17e39bb296d5ab4b60baa69e90a853f3a5ed2\": rpc error: code = NotFound desc = could not find container \"e233046154b94937075780e18db17e39bb296d5ab4b60baa69e90a853f3a5ed2\": container with ID starting with e233046154b94937075780e18db17e39bb296d5ab4b60baa69e90a853f3a5ed2 not found: ID does not exist" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.303014 4873 scope.go:117] "RemoveContainer" containerID="8a41ac2def08f14d8da4168d8f19aa2fabc8acb436044297b419855d749d3f0b" Feb 19 10:05:21 crc kubenswrapper[4873]: E0219 10:05:21.303310 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a41ac2def08f14d8da4168d8f19aa2fabc8acb436044297b419855d749d3f0b\": container with ID starting with 8a41ac2def08f14d8da4168d8f19aa2fabc8acb436044297b419855d749d3f0b not found: ID does not exist" 
containerID="8a41ac2def08f14d8da4168d8f19aa2fabc8acb436044297b419855d749d3f0b" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.303332 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a41ac2def08f14d8da4168d8f19aa2fabc8acb436044297b419855d749d3f0b"} err="failed to get container status \"8a41ac2def08f14d8da4168d8f19aa2fabc8acb436044297b419855d749d3f0b\": rpc error: code = NotFound desc = could not find container \"8a41ac2def08f14d8da4168d8f19aa2fabc8acb436044297b419855d749d3f0b\": container with ID starting with 8a41ac2def08f14d8da4168d8f19aa2fabc8acb436044297b419855d749d3f0b not found: ID does not exist" Feb 19 10:05:21 crc kubenswrapper[4873]: E0219 10:05:21.338969 4873 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c767ece_c345_4a24_93b3_3e7e3f662e0f.slice/crio-807e7166281c5f8f2d5afe5dddfd4f72b55225d08b8ebed491be486ce864d054\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c767ece_c345_4a24_93b3_3e7e3f662e0f.slice\": RecentStats: unable to find data in memory cache]" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.464308 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.464369 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-config-data\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 
19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.464401 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.464499 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-scripts\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.464591 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsnpw\" (UniqueName: \"kubernetes.io/projected/e432fa6f-daf1-4f3a-9f84-ac9495956013-kube-api-access-zsnpw\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.464658 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e432fa6f-daf1-4f3a-9f84-ac9495956013-run-httpd\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.464701 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.464814 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e432fa6f-daf1-4f3a-9f84-ac9495956013-log-httpd\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.495618 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c767ece-c345-4a24-93b3-3e7e3f662e0f" path="/var/lib/kubelet/pods/9c767ece-c345-4a24-93b3-3e7e3f662e0f/volumes" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.566713 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.567012 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-config-data\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.567154 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.567296 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-scripts\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 
10:05:21.567473 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsnpw\" (UniqueName: \"kubernetes.io/projected/e432fa6f-daf1-4f3a-9f84-ac9495956013-kube-api-access-zsnpw\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.567615 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e432fa6f-daf1-4f3a-9f84-ac9495956013-run-httpd\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.567737 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.567919 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e432fa6f-daf1-4f3a-9f84-ac9495956013-log-httpd\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.568062 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e432fa6f-daf1-4f3a-9f84-ac9495956013-run-httpd\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.568294 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e432fa6f-daf1-4f3a-9f84-ac9495956013-log-httpd\") pod \"ceilometer-0\" (UID: 
\"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.572650 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-scripts\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.573092 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.573334 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.576564 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-config-data\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.585657 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e432fa6f-daf1-4f3a-9f84-ac9495956013-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.590418 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsnpw\" (UniqueName: 
\"kubernetes.io/projected/e432fa6f-daf1-4f3a-9f84-ac9495956013-kube-api-access-zsnpw\") pod \"ceilometer-0\" (UID: \"e432fa6f-daf1-4f3a-9f84-ac9495956013\") " pod="openstack/ceilometer-0" Feb 19 10:05:21 crc kubenswrapper[4873]: I0219 10:05:21.611675 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 10:05:22 crc kubenswrapper[4873]: I0219 10:05:22.055250 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 10:05:22 crc kubenswrapper[4873]: I0219 10:05:22.195371 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e432fa6f-daf1-4f3a-9f84-ac9495956013","Type":"ContainerStarted","Data":"32f14b35e516cbeb39793d746a4e9965aee26efb297a23939ab0efa460eb0612"} Feb 19 10:05:23 crc kubenswrapper[4873]: I0219 10:05:23.217176 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e432fa6f-daf1-4f3a-9f84-ac9495956013","Type":"ContainerStarted","Data":"2c9d998caf8ae31c8232e87ab3fa1eec56a6bad66d729371c7c83b4330762ea7"} Feb 19 10:05:23 crc kubenswrapper[4873]: I0219 10:05:23.217568 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e432fa6f-daf1-4f3a-9f84-ac9495956013","Type":"ContainerStarted","Data":"8e11799af3f86e6caa6a0799f6934ce14cb9f68c5589900c4a2b781684607904"} Feb 19 10:05:23 crc kubenswrapper[4873]: I0219 10:05:23.791169 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:23 crc kubenswrapper[4873]: I0219 10:05:23.812849 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.228714 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e432fa6f-daf1-4f3a-9f84-ac9495956013","Type":"ContainerStarted","Data":"847a37be053bbe33f84e36493db72679bb67bf08105ea8c34f1e3410b1da9a52"} Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.248218 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.408597 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-ljn4d"] Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.410020 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.412468 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.413291 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.421452 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ljn4d"] Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.525742 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-scripts\") pod \"nova-cell1-cell-mapping-ljn4d\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.526225 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-config-data\") pod \"nova-cell1-cell-mapping-ljn4d\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:24 crc 
kubenswrapper[4873]: I0219 10:05:24.526391 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ljn4d\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.526673 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fnt9\" (UniqueName: \"kubernetes.io/projected/355c3bd2-5fb4-4a28-be15-e766b61eeed9-kube-api-access-2fnt9\") pod \"nova-cell1-cell-mapping-ljn4d\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.624347 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.628203 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-scripts\") pod \"nova-cell1-cell-mapping-ljn4d\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.628283 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-config-data\") pod \"nova-cell1-cell-mapping-ljn4d\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.628318 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ljn4d\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.628394 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fnt9\" (UniqueName: \"kubernetes.io/projected/355c3bd2-5fb4-4a28-be15-e766b61eeed9-kube-api-access-2fnt9\") pod \"nova-cell1-cell-mapping-ljn4d\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.634574 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-scripts\") pod \"nova-cell1-cell-mapping-ljn4d\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.634838 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ljn4d\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.635917 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-config-data\") pod \"nova-cell1-cell-mapping-ljn4d\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.662306 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fnt9\" (UniqueName: 
\"kubernetes.io/projected/355c3bd2-5fb4-4a28-be15-e766b61eeed9-kube-api-access-2fnt9\") pod \"nova-cell1-cell-mapping-ljn4d\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.700863 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59574c798f-md9g4"] Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.701202 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59574c798f-md9g4" podUID="561650f5-0705-4bab-903d-66bba11301ce" containerName="dnsmasq-dns" containerID="cri-o://5fbace2c800a52592460c743049ca978bd11128cb3460f233a4ff016b5877548" gracePeriod=10 Feb 19 10:05:24 crc kubenswrapper[4873]: I0219 10:05:24.737716 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.235328 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.246369 4873 generic.go:334] "Generic (PLEG): container finished" podID="561650f5-0705-4bab-903d-66bba11301ce" containerID="5fbace2c800a52592460c743049ca978bd11128cb3460f233a4ff016b5877548" exitCode=0 Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.246485 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59574c798f-md9g4" event={"ID":"561650f5-0705-4bab-903d-66bba11301ce","Type":"ContainerDied","Data":"5fbace2c800a52592460c743049ca978bd11128cb3460f233a4ff016b5877548"} Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.246536 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59574c798f-md9g4" event={"ID":"561650f5-0705-4bab-903d-66bba11301ce","Type":"ContainerDied","Data":"36949a05a228a205fbf13f7609b5591909a30f0981a5fef6bf17ca7a531f1283"} Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.246563 4873 scope.go:117] "RemoveContainer" containerID="5fbace2c800a52592460c743049ca978bd11128cb3460f233a4ff016b5877548" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.246623 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59574c798f-md9g4" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.251538 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e432fa6f-daf1-4f3a-9f84-ac9495956013","Type":"ContainerStarted","Data":"85324d9429eb66e2f90861b513b4788e18de0a88a406595d3735ff1130f70689"} Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.279793 4873 scope.go:117] "RemoveContainer" containerID="0f9b9b7ee4ce408ec602f5938ab243185a9595459642fa3aa668d6fc66f1980f" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.319980 4873 scope.go:117] "RemoveContainer" containerID="5fbace2c800a52592460c743049ca978bd11128cb3460f233a4ff016b5877548" Feb 19 10:05:25 crc kubenswrapper[4873]: E0219 10:05:25.320676 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fbace2c800a52592460c743049ca978bd11128cb3460f233a4ff016b5877548\": container with ID starting with 5fbace2c800a52592460c743049ca978bd11128cb3460f233a4ff016b5877548 not found: ID does not exist" containerID="5fbace2c800a52592460c743049ca978bd11128cb3460f233a4ff016b5877548" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.320722 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fbace2c800a52592460c743049ca978bd11128cb3460f233a4ff016b5877548"} err="failed to get container status \"5fbace2c800a52592460c743049ca978bd11128cb3460f233a4ff016b5877548\": rpc error: code = NotFound desc = could not find container \"5fbace2c800a52592460c743049ca978bd11128cb3460f233a4ff016b5877548\": container with ID starting with 5fbace2c800a52592460c743049ca978bd11128cb3460f233a4ff016b5877548 not found: ID does not exist" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.320745 4873 scope.go:117] "RemoveContainer" containerID="0f9b9b7ee4ce408ec602f5938ab243185a9595459642fa3aa668d6fc66f1980f" Feb 19 10:05:25 crc 
kubenswrapper[4873]: E0219 10:05:25.321194 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f9b9b7ee4ce408ec602f5938ab243185a9595459642fa3aa668d6fc66f1980f\": container with ID starting with 0f9b9b7ee4ce408ec602f5938ab243185a9595459642fa3aa668d6fc66f1980f not found: ID does not exist" containerID="0f9b9b7ee4ce408ec602f5938ab243185a9595459642fa3aa668d6fc66f1980f" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.321210 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9b9b7ee4ce408ec602f5938ab243185a9595459642fa3aa668d6fc66f1980f"} err="failed to get container status \"0f9b9b7ee4ce408ec602f5938ab243185a9595459642fa3aa668d6fc66f1980f\": rpc error: code = NotFound desc = could not find container \"0f9b9b7ee4ce408ec602f5938ab243185a9595459642fa3aa668d6fc66f1980f\": container with ID starting with 0f9b9b7ee4ce408ec602f5938ab243185a9595459642fa3aa668d6fc66f1980f not found: ID does not exist" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.356686 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-ovsdbserver-nb\") pod \"561650f5-0705-4bab-903d-66bba11301ce\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.356773 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-config\") pod \"561650f5-0705-4bab-903d-66bba11301ce\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.356910 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmmp6\" (UniqueName: 
\"kubernetes.io/projected/561650f5-0705-4bab-903d-66bba11301ce-kube-api-access-wmmp6\") pod \"561650f5-0705-4bab-903d-66bba11301ce\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.356935 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-dns-swift-storage-0\") pod \"561650f5-0705-4bab-903d-66bba11301ce\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.356959 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-ovsdbserver-sb\") pod \"561650f5-0705-4bab-903d-66bba11301ce\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.356986 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-dns-svc\") pod \"561650f5-0705-4bab-903d-66bba11301ce\" (UID: \"561650f5-0705-4bab-903d-66bba11301ce\") " Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.365426 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/561650f5-0705-4bab-903d-66bba11301ce-kube-api-access-wmmp6" (OuterVolumeSpecName: "kube-api-access-wmmp6") pod "561650f5-0705-4bab-903d-66bba11301ce" (UID: "561650f5-0705-4bab-903d-66bba11301ce"). InnerVolumeSpecName "kube-api-access-wmmp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.412066 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "561650f5-0705-4bab-903d-66bba11301ce" (UID: "561650f5-0705-4bab-903d-66bba11301ce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.417731 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "561650f5-0705-4bab-903d-66bba11301ce" (UID: "561650f5-0705-4bab-903d-66bba11301ce"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.421526 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "561650f5-0705-4bab-903d-66bba11301ce" (UID: "561650f5-0705-4bab-903d-66bba11301ce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.423750 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-config" (OuterVolumeSpecName: "config") pod "561650f5-0705-4bab-903d-66bba11301ce" (UID: "561650f5-0705-4bab-903d-66bba11301ce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.425091 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ljn4d"] Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.444507 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "561650f5-0705-4bab-903d-66bba11301ce" (UID: "561650f5-0705-4bab-903d-66bba11301ce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.459396 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.459428 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.459440 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmmp6\" (UniqueName: \"kubernetes.io/projected/561650f5-0705-4bab-903d-66bba11301ce-kube-api-access-wmmp6\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.459455 4873 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.459464 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 
10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.459471 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/561650f5-0705-4bab-903d-66bba11301ce-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.581819 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59574c798f-md9g4"] Feb 19 10:05:25 crc kubenswrapper[4873]: I0219 10:05:25.592721 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59574c798f-md9g4"] Feb 19 10:05:26 crc kubenswrapper[4873]: I0219 10:05:26.275034 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ljn4d" event={"ID":"355c3bd2-5fb4-4a28-be15-e766b61eeed9","Type":"ContainerStarted","Data":"087819f431db46b81166e897b21b99194aa2c81f307651c133879685fcb5a03d"} Feb 19 10:05:26 crc kubenswrapper[4873]: I0219 10:05:26.275097 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ljn4d" event={"ID":"355c3bd2-5fb4-4a28-be15-e766b61eeed9","Type":"ContainerStarted","Data":"f92a5b55c4afad8b7be3982a8f532ea10a49853d42efb23114e4c171eee5c339"} Feb 19 10:05:26 crc kubenswrapper[4873]: I0219 10:05:26.277558 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 10:05:26 crc kubenswrapper[4873]: I0219 10:05:26.299620 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-ljn4d" podStartSLOduration=2.299601247 podStartE2EDuration="2.299601247s" podCreationTimestamp="2026-02-19 10:05:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:05:26.295303501 +0000 UTC m=+1235.584735139" watchObservedRunningTime="2026-02-19 10:05:26.299601247 +0000 UTC m=+1235.589032895" Feb 19 10:05:26 crc kubenswrapper[4873]: I0219 
10:05:26.321233 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.509697875 podStartE2EDuration="5.321210833s" podCreationTimestamp="2026-02-19 10:05:21 +0000 UTC" firstStartedPulling="2026-02-19 10:05:22.061004541 +0000 UTC m=+1231.350436179" lastFinishedPulling="2026-02-19 10:05:24.872517499 +0000 UTC m=+1234.161949137" observedRunningTime="2026-02-19 10:05:26.313472661 +0000 UTC m=+1235.602904299" watchObservedRunningTime="2026-02-19 10:05:26.321210833 +0000 UTC m=+1235.610642471" Feb 19 10:05:27 crc kubenswrapper[4873]: I0219 10:05:27.497750 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="561650f5-0705-4bab-903d-66bba11301ce" path="/var/lib/kubelet/pods/561650f5-0705-4bab-903d-66bba11301ce/volumes" Feb 19 10:05:29 crc kubenswrapper[4873]: I0219 10:05:29.545452 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 10:05:29 crc kubenswrapper[4873]: I0219 10:05:29.545770 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 10:05:30 crc kubenswrapper[4873]: I0219 10:05:30.556245 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 10:05:30 crc kubenswrapper[4873]: I0219 10:05:30.556291 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 10:05:31 crc kubenswrapper[4873]: I0219 10:05:31.325119 4873 generic.go:334] "Generic (PLEG): container finished" 
podID="355c3bd2-5fb4-4a28-be15-e766b61eeed9" containerID="087819f431db46b81166e897b21b99194aa2c81f307651c133879685fcb5a03d" exitCode=0 Feb 19 10:05:31 crc kubenswrapper[4873]: I0219 10:05:31.325200 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ljn4d" event={"ID":"355c3bd2-5fb4-4a28-be15-e766b61eeed9","Type":"ContainerDied","Data":"087819f431db46b81166e897b21b99194aa2c81f307651c133879685fcb5a03d"} Feb 19 10:05:32 crc kubenswrapper[4873]: I0219 10:05:32.787355 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:32 crc kubenswrapper[4873]: I0219 10:05:32.913958 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-scripts\") pod \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " Feb 19 10:05:32 crc kubenswrapper[4873]: I0219 10:05:32.914050 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fnt9\" (UniqueName: \"kubernetes.io/projected/355c3bd2-5fb4-4a28-be15-e766b61eeed9-kube-api-access-2fnt9\") pod \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " Feb 19 10:05:32 crc kubenswrapper[4873]: I0219 10:05:32.914078 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-config-data\") pod \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\" (UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " Feb 19 10:05:32 crc kubenswrapper[4873]: I0219 10:05:32.914167 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-combined-ca-bundle\") pod \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\" 
(UID: \"355c3bd2-5fb4-4a28-be15-e766b61eeed9\") " Feb 19 10:05:32 crc kubenswrapper[4873]: I0219 10:05:32.919728 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/355c3bd2-5fb4-4a28-be15-e766b61eeed9-kube-api-access-2fnt9" (OuterVolumeSpecName: "kube-api-access-2fnt9") pod "355c3bd2-5fb4-4a28-be15-e766b61eeed9" (UID: "355c3bd2-5fb4-4a28-be15-e766b61eeed9"). InnerVolumeSpecName "kube-api-access-2fnt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:05:32 crc kubenswrapper[4873]: I0219 10:05:32.919830 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-scripts" (OuterVolumeSpecName: "scripts") pod "355c3bd2-5fb4-4a28-be15-e766b61eeed9" (UID: "355c3bd2-5fb4-4a28-be15-e766b61eeed9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:32 crc kubenswrapper[4873]: I0219 10:05:32.949332 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "355c3bd2-5fb4-4a28-be15-e766b61eeed9" (UID: "355c3bd2-5fb4-4a28-be15-e766b61eeed9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:32 crc kubenswrapper[4873]: I0219 10:05:32.953711 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-config-data" (OuterVolumeSpecName: "config-data") pod "355c3bd2-5fb4-4a28-be15-e766b61eeed9" (UID: "355c3bd2-5fb4-4a28-be15-e766b61eeed9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.016800 4873 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.016864 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fnt9\" (UniqueName: \"kubernetes.io/projected/355c3bd2-5fb4-4a28-be15-e766b61eeed9-kube-api-access-2fnt9\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.016880 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.016892 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355c3bd2-5fb4-4a28-be15-e766b61eeed9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.347663 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ljn4d" event={"ID":"355c3bd2-5fb4-4a28-be15-e766b61eeed9","Type":"ContainerDied","Data":"f92a5b55c4afad8b7be3982a8f532ea10a49853d42efb23114e4c171eee5c339"} Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.347715 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f92a5b55c4afad8b7be3982a8f532ea10a49853d42efb23114e4c171eee5c339" Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.347732 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ljn4d" Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.519373 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.519655 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" containerName="nova-api-log" containerID="cri-o://a591f50f852c32017b43b07ceb566ed62ac4d82f0a926ca84d57c3237e89bc74" gracePeriod=30 Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.519750 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" containerName="nova-api-api" containerID="cri-o://ee64d4a6004bb57017d578c308d94024de77698e585848d30e9375ee1fa2c2cb" gracePeriod=30 Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.544700 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.544981 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ab0c5b09-1134-4319-890d-8d42e916fc4c" containerName="nova-metadata-log" containerID="cri-o://109894b19d76dd1c9ccf380213b2386f244585a42e16fabe157c7ca98cdcf609" gracePeriod=30 Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.545051 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ab0c5b09-1134-4319-890d-8d42e916fc4c" containerName="nova-metadata-metadata" containerID="cri-o://772f80f78694557364cb6aacede90edcdbc43950df0974bec13a72c5fca43e1c" gracePeriod=30 Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.558614 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 10:05:33 crc kubenswrapper[4873]: I0219 10:05:33.558898 4873 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8db35141-6a4c-41cb-8a70-c68ab32fb2fe" containerName="nova-scheduler-scheduler" containerID="cri-o://0076bd1c46cbf74257579e26ec45274c90ac1a1b4889da41cc03fa8662a3134c" gracePeriod=30 Feb 19 10:05:34 crc kubenswrapper[4873]: I0219 10:05:34.356527 4873 generic.go:334] "Generic (PLEG): container finished" podID="ab0c5b09-1134-4319-890d-8d42e916fc4c" containerID="109894b19d76dd1c9ccf380213b2386f244585a42e16fabe157c7ca98cdcf609" exitCode=143 Feb 19 10:05:34 crc kubenswrapper[4873]: I0219 10:05:34.356590 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c5b09-1134-4319-890d-8d42e916fc4c","Type":"ContainerDied","Data":"109894b19d76dd1c9ccf380213b2386f244585a42e16fabe157c7ca98cdcf609"} Feb 19 10:05:34 crc kubenswrapper[4873]: I0219 10:05:34.358548 4873 generic.go:334] "Generic (PLEG): container finished" podID="772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" containerID="a591f50f852c32017b43b07ceb566ed62ac4d82f0a926ca84d57c3237e89bc74" exitCode=143 Feb 19 10:05:34 crc kubenswrapper[4873]: I0219 10:05:34.358579 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6","Type":"ContainerDied","Data":"a591f50f852c32017b43b07ceb566ed62ac4d82f0a926ca84d57c3237e89bc74"} Feb 19 10:05:34 crc kubenswrapper[4873]: I0219 10:05:34.986730 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 10:05:34 crc kubenswrapper[4873]: I0219 10:05:34.995277 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.070887 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-logs\") pod \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.070961 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttk6m\" (UniqueName: \"kubernetes.io/projected/ab0c5b09-1134-4319-890d-8d42e916fc4c-kube-api-access-ttk6m\") pod \"ab0c5b09-1134-4319-890d-8d42e916fc4c\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.071035 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-config-data\") pod \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.071058 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjk8d\" (UniqueName: \"kubernetes.io/projected/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-kube-api-access-qjk8d\") pod \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.071116 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-internal-tls-certs\") pod \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.071178 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-nova-metadata-tls-certs\") pod \"ab0c5b09-1134-4319-890d-8d42e916fc4c\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.071541 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-logs" (OuterVolumeSpecName: "logs") pod "772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" (UID: "772434a9-08d5-499e-9ea1-e9ed0cc1e1b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.072047 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-combined-ca-bundle\") pod \"ab0c5b09-1134-4319-890d-8d42e916fc4c\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.072076 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-public-tls-certs\") pod \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\" (UID: \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.072127 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c5b09-1134-4319-890d-8d42e916fc4c-logs\") pod \"ab0c5b09-1134-4319-890d-8d42e916fc4c\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.072191 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-combined-ca-bundle\") pod \"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\" (UID: 
\"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6\") " Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.072242 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-config-data\") pod \"ab0c5b09-1134-4319-890d-8d42e916fc4c\" (UID: \"ab0c5b09-1134-4319-890d-8d42e916fc4c\") " Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.072728 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.073817 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab0c5b09-1134-4319-890d-8d42e916fc4c-logs" (OuterVolumeSpecName: "logs") pod "ab0c5b09-1134-4319-890d-8d42e916fc4c" (UID: "ab0c5b09-1134-4319-890d-8d42e916fc4c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.078208 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-kube-api-access-qjk8d" (OuterVolumeSpecName: "kube-api-access-qjk8d") pod "772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" (UID: "772434a9-08d5-499e-9ea1-e9ed0cc1e1b6"). InnerVolumeSpecName "kube-api-access-qjk8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.095462 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab0c5b09-1134-4319-890d-8d42e916fc4c-kube-api-access-ttk6m" (OuterVolumeSpecName: "kube-api-access-ttk6m") pod "ab0c5b09-1134-4319-890d-8d42e916fc4c" (UID: "ab0c5b09-1134-4319-890d-8d42e916fc4c"). InnerVolumeSpecName "kube-api-access-ttk6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.106249 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-config-data" (OuterVolumeSpecName: "config-data") pod "772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" (UID: "772434a9-08d5-499e-9ea1-e9ed0cc1e1b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.150871 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab0c5b09-1134-4319-890d-8d42e916fc4c" (UID: "ab0c5b09-1134-4319-890d-8d42e916fc4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.166378 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" (UID: "772434a9-08d5-499e-9ea1-e9ed0cc1e1b6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.166778 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-config-data" (OuterVolumeSpecName: "config-data") pod "ab0c5b09-1134-4319-890d-8d42e916fc4c" (UID: "ab0c5b09-1134-4319-890d-8d42e916fc4c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.174529 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.174569 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttk6m\" (UniqueName: \"kubernetes.io/projected/ab0c5b09-1134-4319-890d-8d42e916fc4c-kube-api-access-ttk6m\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.174584 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.174595 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjk8d\" (UniqueName: \"kubernetes.io/projected/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-kube-api-access-qjk8d\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.174607 4873 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.174617 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.174628 4873 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0c5b09-1134-4319-890d-8d42e916fc4c-logs\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.194791 4873 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ab0c5b09-1134-4319-890d-8d42e916fc4c" (UID: "ab0c5b09-1134-4319-890d-8d42e916fc4c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.205256 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" (UID: "772434a9-08d5-499e-9ea1-e9ed0cc1e1b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.207736 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" (UID: "772434a9-08d5-499e-9ea1-e9ed0cc1e1b6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.276190 4873 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab0c5b09-1134-4319-890d-8d42e916fc4c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.276229 4873 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.276243 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.368872 4873 generic.go:334] "Generic (PLEG): container finished" podID="772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" containerID="ee64d4a6004bb57017d578c308d94024de77698e585848d30e9375ee1fa2c2cb" exitCode=0 Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.368947 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6","Type":"ContainerDied","Data":"ee64d4a6004bb57017d578c308d94024de77698e585848d30e9375ee1fa2c2cb"} Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.368977 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"772434a9-08d5-499e-9ea1-e9ed0cc1e1b6","Type":"ContainerDied","Data":"11443c61c72c773d1b9e2f4dbdc5cf11081029e2fa850f7683c10c10c8471883"} Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.368993 4873 scope.go:117] "RemoveContainer" containerID="ee64d4a6004bb57017d578c308d94024de77698e585848d30e9375ee1fa2c2cb" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.368994 4873 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.370818 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.370779 4873 generic.go:334] "Generic (PLEG): container finished" podID="ab0c5b09-1134-4319-890d-8d42e916fc4c" containerID="772f80f78694557364cb6aacede90edcdbc43950df0974bec13a72c5fca43e1c" exitCode=0 Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.370865 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c5b09-1134-4319-890d-8d42e916fc4c","Type":"ContainerDied","Data":"772f80f78694557364cb6aacede90edcdbc43950df0974bec13a72c5fca43e1c"} Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.370898 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab0c5b09-1134-4319-890d-8d42e916fc4c","Type":"ContainerDied","Data":"bccf19aed991c39d7abd6ccba3455f58690ec6cd1d9a81513e1e6040784b81a5"} Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.392735 4873 scope.go:117] "RemoveContainer" containerID="a591f50f852c32017b43b07ceb566ed62ac4d82f0a926ca84d57c3237e89bc74" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.427872 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.435675 4873 scope.go:117] "RemoveContainer" containerID="ee64d4a6004bb57017d578c308d94024de77698e585848d30e9375ee1fa2c2cb" Feb 19 10:05:35 crc kubenswrapper[4873]: E0219 10:05:35.436171 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee64d4a6004bb57017d578c308d94024de77698e585848d30e9375ee1fa2c2cb\": container with ID starting with ee64d4a6004bb57017d578c308d94024de77698e585848d30e9375ee1fa2c2cb not found: ID does not exist" 
containerID="ee64d4a6004bb57017d578c308d94024de77698e585848d30e9375ee1fa2c2cb" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.436235 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee64d4a6004bb57017d578c308d94024de77698e585848d30e9375ee1fa2c2cb"} err="failed to get container status \"ee64d4a6004bb57017d578c308d94024de77698e585848d30e9375ee1fa2c2cb\": rpc error: code = NotFound desc = could not find container \"ee64d4a6004bb57017d578c308d94024de77698e585848d30e9375ee1fa2c2cb\": container with ID starting with ee64d4a6004bb57017d578c308d94024de77698e585848d30e9375ee1fa2c2cb not found: ID does not exist" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.436273 4873 scope.go:117] "RemoveContainer" containerID="a591f50f852c32017b43b07ceb566ed62ac4d82f0a926ca84d57c3237e89bc74" Feb 19 10:05:35 crc kubenswrapper[4873]: E0219 10:05:35.436578 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a591f50f852c32017b43b07ceb566ed62ac4d82f0a926ca84d57c3237e89bc74\": container with ID starting with a591f50f852c32017b43b07ceb566ed62ac4d82f0a926ca84d57c3237e89bc74 not found: ID does not exist" containerID="a591f50f852c32017b43b07ceb566ed62ac4d82f0a926ca84d57c3237e89bc74" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.436618 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a591f50f852c32017b43b07ceb566ed62ac4d82f0a926ca84d57c3237e89bc74"} err="failed to get container status \"a591f50f852c32017b43b07ceb566ed62ac4d82f0a926ca84d57c3237e89bc74\": rpc error: code = NotFound desc = could not find container \"a591f50f852c32017b43b07ceb566ed62ac4d82f0a926ca84d57c3237e89bc74\": container with ID starting with a591f50f852c32017b43b07ceb566ed62ac4d82f0a926ca84d57c3237e89bc74 not found: ID does not exist" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.436647 4873 scope.go:117] 
"RemoveContainer" containerID="772f80f78694557364cb6aacede90edcdbc43950df0974bec13a72c5fca43e1c" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.447917 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.459622 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.470880 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.483588 4873 scope.go:117] "RemoveContainer" containerID="109894b19d76dd1c9ccf380213b2386f244585a42e16fabe157c7ca98cdcf609" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.496047 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" path="/var/lib/kubelet/pods/772434a9-08d5-499e-9ea1-e9ed0cc1e1b6/volumes" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.497942 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab0c5b09-1134-4319-890d-8d42e916fc4c" path="/var/lib/kubelet/pods/ab0c5b09-1134-4319-890d-8d42e916fc4c/volumes" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.514578 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 10:05:35 crc kubenswrapper[4873]: E0219 10:05:35.515228 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0c5b09-1134-4319-890d-8d42e916fc4c" containerName="nova-metadata-log" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.515246 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0c5b09-1134-4319-890d-8d42e916fc4c" containerName="nova-metadata-log" Feb 19 10:05:35 crc kubenswrapper[4873]: E0219 10:05:35.515271 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" containerName="nova-api-api" Feb 19 10:05:35 crc 
kubenswrapper[4873]: I0219 10:05:35.515277 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" containerName="nova-api-api" Feb 19 10:05:35 crc kubenswrapper[4873]: E0219 10:05:35.515311 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561650f5-0705-4bab-903d-66bba11301ce" containerName="init" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.515318 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="561650f5-0705-4bab-903d-66bba11301ce" containerName="init" Feb 19 10:05:35 crc kubenswrapper[4873]: E0219 10:05:35.515325 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561650f5-0705-4bab-903d-66bba11301ce" containerName="dnsmasq-dns" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.515331 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="561650f5-0705-4bab-903d-66bba11301ce" containerName="dnsmasq-dns" Feb 19 10:05:35 crc kubenswrapper[4873]: E0219 10:05:35.515345 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" containerName="nova-api-log" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.515353 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" containerName="nova-api-log" Feb 19 10:05:35 crc kubenswrapper[4873]: E0219 10:05:35.515366 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0c5b09-1134-4319-890d-8d42e916fc4c" containerName="nova-metadata-metadata" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.515372 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0c5b09-1134-4319-890d-8d42e916fc4c" containerName="nova-metadata-metadata" Feb 19 10:05:35 crc kubenswrapper[4873]: E0219 10:05:35.515394 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355c3bd2-5fb4-4a28-be15-e766b61eeed9" containerName="nova-manage" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.515400 
4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="355c3bd2-5fb4-4a28-be15-e766b61eeed9" containerName="nova-manage" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.515590 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="561650f5-0705-4bab-903d-66bba11301ce" containerName="dnsmasq-dns" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.515601 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" containerName="nova-api-api" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.515615 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="772434a9-08d5-499e-9ea1-e9ed0cc1e1b6" containerName="nova-api-log" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.515629 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="355c3bd2-5fb4-4a28-be15-e766b61eeed9" containerName="nova-manage" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.516259 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0c5b09-1134-4319-890d-8d42e916fc4c" containerName="nova-metadata-metadata" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.516277 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0c5b09-1134-4319-890d-8d42e916fc4c" containerName="nova-metadata-log" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.517611 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.519599 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.519933 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.525748 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.527454 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.530408 4873 scope.go:117] "RemoveContainer" containerID="772f80f78694557364cb6aacede90edcdbc43950df0974bec13a72c5fca43e1c" Feb 19 10:05:35 crc kubenswrapper[4873]: E0219 10:05:35.531932 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"772f80f78694557364cb6aacede90edcdbc43950df0974bec13a72c5fca43e1c\": container with ID starting with 772f80f78694557364cb6aacede90edcdbc43950df0974bec13a72c5fca43e1c not found: ID does not exist" containerID="772f80f78694557364cb6aacede90edcdbc43950df0974bec13a72c5fca43e1c" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.531992 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"772f80f78694557364cb6aacede90edcdbc43950df0974bec13a72c5fca43e1c"} err="failed to get container status \"772f80f78694557364cb6aacede90edcdbc43950df0974bec13a72c5fca43e1c\": rpc error: code = NotFound desc = could not find container \"772f80f78694557364cb6aacede90edcdbc43950df0974bec13a72c5fca43e1c\": container with ID starting with 772f80f78694557364cb6aacede90edcdbc43950df0974bec13a72c5fca43e1c not found: ID does not exist" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 
10:05:35.532021 4873 scope.go:117] "RemoveContainer" containerID="109894b19d76dd1c9ccf380213b2386f244585a42e16fabe157c7ca98cdcf609" Feb 19 10:05:35 crc kubenswrapper[4873]: E0219 10:05:35.532549 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"109894b19d76dd1c9ccf380213b2386f244585a42e16fabe157c7ca98cdcf609\": container with ID starting with 109894b19d76dd1c9ccf380213b2386f244585a42e16fabe157c7ca98cdcf609 not found: ID does not exist" containerID="109894b19d76dd1c9ccf380213b2386f244585a42e16fabe157c7ca98cdcf609" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.532593 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109894b19d76dd1c9ccf380213b2386f244585a42e16fabe157c7ca98cdcf609"} err="failed to get container status \"109894b19d76dd1c9ccf380213b2386f244585a42e16fabe157c7ca98cdcf609\": rpc error: code = NotFound desc = could not find container \"109894b19d76dd1c9ccf380213b2386f244585a42e16fabe157c7ca98cdcf609\": container with ID starting with 109894b19d76dd1c9ccf380213b2386f244585a42e16fabe157c7ca98cdcf609 not found: ID does not exist" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.536035 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.542575 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.542953 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.543162 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.549726 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:05:35 
crc kubenswrapper[4873]: I0219 10:05:35.582356 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.582401 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-config-data\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.582426 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4e613e-0a31-4191-9afb-4fd0300586f9-logs\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.582491 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4e613e-0a31-4191-9afb-4fd0300586f9-public-tls-certs\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.582659 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4e613e-0a31-4191-9afb-4fd0300586f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.582729 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-logs\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.582756 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4e613e-0a31-4191-9afb-4fd0300586f9-config-data\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.582852 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z66vr\" (UniqueName: \"kubernetes.io/projected/4f4e613e-0a31-4191-9afb-4fd0300586f9-kube-api-access-z66vr\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.582906 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.582927 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckp52\" (UniqueName: \"kubernetes.io/projected/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-kube-api-access-ckp52\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.582957 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4f4e613e-0a31-4191-9afb-4fd0300586f9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.684618 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4e613e-0a31-4191-9afb-4fd0300586f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.684721 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-logs\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.684761 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4e613e-0a31-4191-9afb-4fd0300586f9-config-data\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.684871 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z66vr\" (UniqueName: \"kubernetes.io/projected/4f4e613e-0a31-4191-9afb-4fd0300586f9-kube-api-access-z66vr\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.684919 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: 
I0219 10:05:35.684948 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckp52\" (UniqueName: \"kubernetes.io/projected/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-kube-api-access-ckp52\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.684986 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4e613e-0a31-4191-9afb-4fd0300586f9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.685017 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.685086 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-config-data\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.685209 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4e613e-0a31-4191-9afb-4fd0300586f9-logs\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.685247 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4f4e613e-0a31-4191-9afb-4fd0300586f9-public-tls-certs\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.686530 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-logs\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.687589 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4e613e-0a31-4191-9afb-4fd0300586f9-logs\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.691303 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-config-data\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.691313 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4e613e-0a31-4191-9afb-4fd0300586f9-public-tls-certs\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.691396 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4e613e-0a31-4191-9afb-4fd0300586f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.691635 4873 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4e613e-0a31-4191-9afb-4fd0300586f9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.692534 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.699977 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4e613e-0a31-4191-9afb-4fd0300586f9-config-data\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") " pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.701948 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.703436 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckp52\" (UniqueName: \"kubernetes.io/projected/15cbab3c-9843-4bf6-b0e8-b65dec1e5112-kube-api-access-ckp52\") pod \"nova-metadata-0\" (UID: \"15cbab3c-9843-4bf6-b0e8-b65dec1e5112\") " pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.707268 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z66vr\" (UniqueName: \"kubernetes.io/projected/4f4e613e-0a31-4191-9afb-4fd0300586f9-kube-api-access-z66vr\") pod \"nova-api-0\" (UID: \"4f4e613e-0a31-4191-9afb-4fd0300586f9\") 
" pod="openstack/nova-api-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.843945 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 10:05:35 crc kubenswrapper[4873]: I0219 10:05:35.857949 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 10:05:36 crc kubenswrapper[4873]: I0219 10:05:36.297667 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 10:05:36 crc kubenswrapper[4873]: W0219 10:05:36.298201 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f4e613e_0a31_4191_9afb_4fd0300586f9.slice/crio-43de6e5637d1b8678ecde0ab5263ea883f419f771bacbc5ddf5e6a1395de3e37 WatchSource:0}: Error finding container 43de6e5637d1b8678ecde0ab5263ea883f419f771bacbc5ddf5e6a1395de3e37: Status 404 returned error can't find the container with id 43de6e5637d1b8678ecde0ab5263ea883f419f771bacbc5ddf5e6a1395de3e37 Feb 19 10:05:36 crc kubenswrapper[4873]: W0219 10:05:36.375924 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15cbab3c_9843_4bf6_b0e8_b65dec1e5112.slice/crio-bafa115a07f4a93bc6f7ece8d23b0c1c97209d3481343763924ca339e6de1621 WatchSource:0}: Error finding container bafa115a07f4a93bc6f7ece8d23b0c1c97209d3481343763924ca339e6de1621: Status 404 returned error can't find the container with id bafa115a07f4a93bc6f7ece8d23b0c1c97209d3481343763924ca339e6de1621 Feb 19 10:05:36 crc kubenswrapper[4873]: I0219 10:05:36.377771 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 10:05:36 crc kubenswrapper[4873]: I0219 10:05:36.382622 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"4f4e613e-0a31-4191-9afb-4fd0300586f9","Type":"ContainerStarted","Data":"43de6e5637d1b8678ecde0ab5263ea883f419f771bacbc5ddf5e6a1395de3e37"} Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.340148 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.395674 4873 generic.go:334] "Generic (PLEG): container finished" podID="8db35141-6a4c-41cb-8a70-c68ab32fb2fe" containerID="0076bd1c46cbf74257579e26ec45274c90ac1a1b4889da41cc03fa8662a3134c" exitCode=0 Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.395727 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.395788 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8db35141-6a4c-41cb-8a70-c68ab32fb2fe","Type":"ContainerDied","Data":"0076bd1c46cbf74257579e26ec45274c90ac1a1b4889da41cc03fa8662a3134c"} Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.395903 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8db35141-6a4c-41cb-8a70-c68ab32fb2fe","Type":"ContainerDied","Data":"f055548cf56af3386b5caa98b53e876cf48bc03e71f26f63cd2f36d9a6f05688"} Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.395946 4873 scope.go:117] "RemoveContainer" containerID="0076bd1c46cbf74257579e26ec45274c90ac1a1b4889da41cc03fa8662a3134c" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.399427 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f4e613e-0a31-4191-9afb-4fd0300586f9","Type":"ContainerStarted","Data":"9d66b0406dd8f3c71680ca8ff73a8bc9b08c01bf307f2c8c00ef8d085181f558"} Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.399461 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"4f4e613e-0a31-4191-9afb-4fd0300586f9","Type":"ContainerStarted","Data":"c2cf9eb54249a68ed8392ecaa69612f7ada56c52548f065a93329538658ab765"} Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.403352 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cbab3c-9843-4bf6-b0e8-b65dec1e5112","Type":"ContainerStarted","Data":"4d76dad0283fc179b6ca7ee703579cd31efa8b206fd7cd2dbde38f28d03e9d19"} Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.403382 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cbab3c-9843-4bf6-b0e8-b65dec1e5112","Type":"ContainerStarted","Data":"7cfe4de0a0212b2f5ffbcac217726fed9cb19ad2e9e4b5106643a951701d1708"} Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.403392 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cbab3c-9843-4bf6-b0e8-b65dec1e5112","Type":"ContainerStarted","Data":"bafa115a07f4a93bc6f7ece8d23b0c1c97209d3481343763924ca339e6de1621"} Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.424534 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4nvb\" (UniqueName: \"kubernetes.io/projected/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-kube-api-access-m4nvb\") pod \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\" (UID: \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\") " Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.424814 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-combined-ca-bundle\") pod \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\" (UID: \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\") " Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.424889 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-config-data\") pod \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\" (UID: \"8db35141-6a4c-41cb-8a70-c68ab32fb2fe\") " Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.432662 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-kube-api-access-m4nvb" (OuterVolumeSpecName: "kube-api-access-m4nvb") pod "8db35141-6a4c-41cb-8a70-c68ab32fb2fe" (UID: "8db35141-6a4c-41cb-8a70-c68ab32fb2fe"). InnerVolumeSpecName "kube-api-access-m4nvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.436560 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.436544056 podStartE2EDuration="2.436544056s" podCreationTimestamp="2026-02-19 10:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:05:37.419183006 +0000 UTC m=+1246.708614654" watchObservedRunningTime="2026-02-19 10:05:37.436544056 +0000 UTC m=+1246.725975694" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.440942 4873 scope.go:117] "RemoveContainer" containerID="0076bd1c46cbf74257579e26ec45274c90ac1a1b4889da41cc03fa8662a3134c" Feb 19 10:05:37 crc kubenswrapper[4873]: E0219 10:05:37.441571 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0076bd1c46cbf74257579e26ec45274c90ac1a1b4889da41cc03fa8662a3134c\": container with ID starting with 0076bd1c46cbf74257579e26ec45274c90ac1a1b4889da41cc03fa8662a3134c not found: ID does not exist" containerID="0076bd1c46cbf74257579e26ec45274c90ac1a1b4889da41cc03fa8662a3134c" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.441699 4873 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0076bd1c46cbf74257579e26ec45274c90ac1a1b4889da41cc03fa8662a3134c"} err="failed to get container status \"0076bd1c46cbf74257579e26ec45274c90ac1a1b4889da41cc03fa8662a3134c\": rpc error: code = NotFound desc = could not find container \"0076bd1c46cbf74257579e26ec45274c90ac1a1b4889da41cc03fa8662a3134c\": container with ID starting with 0076bd1c46cbf74257579e26ec45274c90ac1a1b4889da41cc03fa8662a3134c not found: ID does not exist" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.460984 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.460959032 podStartE2EDuration="2.460959032s" podCreationTimestamp="2026-02-19 10:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:05:37.450142553 +0000 UTC m=+1246.739574201" watchObservedRunningTime="2026-02-19 10:05:37.460959032 +0000 UTC m=+1246.750390670" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.464366 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-config-data" (OuterVolumeSpecName: "config-data") pod "8db35141-6a4c-41cb-8a70-c68ab32fb2fe" (UID: "8db35141-6a4c-41cb-8a70-c68ab32fb2fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.465415 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8db35141-6a4c-41cb-8a70-c68ab32fb2fe" (UID: "8db35141-6a4c-41cb-8a70-c68ab32fb2fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.528226 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.528376 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.528540 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4nvb\" (UniqueName: \"kubernetes.io/projected/8db35141-6a4c-41cb-8a70-c68ab32fb2fe-kube-api-access-m4nvb\") on node \"crc\" DevicePath \"\"" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.719210 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.730681 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.750414 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 10:05:37 crc kubenswrapper[4873]: E0219 10:05:37.751081 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db35141-6a4c-41cb-8a70-c68ab32fb2fe" containerName="nova-scheduler-scheduler" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.751139 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db35141-6a4c-41cb-8a70-c68ab32fb2fe" containerName="nova-scheduler-scheduler" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.751464 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db35141-6a4c-41cb-8a70-c68ab32fb2fe" containerName="nova-scheduler-scheduler" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 
10:05:37.752514 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.756234 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.763257 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.834252 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h79t\" (UniqueName: \"kubernetes.io/projected/adb0395e-00f8-4bc6-a0a6-2b956235c58c-kube-api-access-9h79t\") pod \"nova-scheduler-0\" (UID: \"adb0395e-00f8-4bc6-a0a6-2b956235c58c\") " pod="openstack/nova-scheduler-0" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.834338 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adb0395e-00f8-4bc6-a0a6-2b956235c58c-config-data\") pod \"nova-scheduler-0\" (UID: \"adb0395e-00f8-4bc6-a0a6-2b956235c58c\") " pod="openstack/nova-scheduler-0" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.834445 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adb0395e-00f8-4bc6-a0a6-2b956235c58c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"adb0395e-00f8-4bc6-a0a6-2b956235c58c\") " pod="openstack/nova-scheduler-0" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.936571 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adb0395e-00f8-4bc6-a0a6-2b956235c58c-config-data\") pod \"nova-scheduler-0\" (UID: \"adb0395e-00f8-4bc6-a0a6-2b956235c58c\") " pod="openstack/nova-scheduler-0" Feb 19 10:05:37 crc 
kubenswrapper[4873]: I0219 10:05:37.936951 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adb0395e-00f8-4bc6-a0a6-2b956235c58c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"adb0395e-00f8-4bc6-a0a6-2b956235c58c\") " pod="openstack/nova-scheduler-0" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.937163 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h79t\" (UniqueName: \"kubernetes.io/projected/adb0395e-00f8-4bc6-a0a6-2b956235c58c-kube-api-access-9h79t\") pod \"nova-scheduler-0\" (UID: \"adb0395e-00f8-4bc6-a0a6-2b956235c58c\") " pod="openstack/nova-scheduler-0" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.940162 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adb0395e-00f8-4bc6-a0a6-2b956235c58c-config-data\") pod \"nova-scheduler-0\" (UID: \"adb0395e-00f8-4bc6-a0a6-2b956235c58c\") " pod="openstack/nova-scheduler-0" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.942463 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adb0395e-00f8-4bc6-a0a6-2b956235c58c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"adb0395e-00f8-4bc6-a0a6-2b956235c58c\") " pod="openstack/nova-scheduler-0" Feb 19 10:05:37 crc kubenswrapper[4873]: I0219 10:05:37.953984 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h79t\" (UniqueName: \"kubernetes.io/projected/adb0395e-00f8-4bc6-a0a6-2b956235c58c-kube-api-access-9h79t\") pod \"nova-scheduler-0\" (UID: \"adb0395e-00f8-4bc6-a0a6-2b956235c58c\") " pod="openstack/nova-scheduler-0" Feb 19 10:05:38 crc kubenswrapper[4873]: I0219 10:05:38.070592 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 10:05:38 crc kubenswrapper[4873]: I0219 10:05:38.501754 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 10:05:38 crc kubenswrapper[4873]: W0219 10:05:38.503789 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadb0395e_00f8_4bc6_a0a6_2b956235c58c.slice/crio-80d2b1106b66a9e1508060bab5ff72cc07015decc7183687b312f61552fc028b WatchSource:0}: Error finding container 80d2b1106b66a9e1508060bab5ff72cc07015decc7183687b312f61552fc028b: Status 404 returned error can't find the container with id 80d2b1106b66a9e1508060bab5ff72cc07015decc7183687b312f61552fc028b Feb 19 10:05:39 crc kubenswrapper[4873]: I0219 10:05:39.425206 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"adb0395e-00f8-4bc6-a0a6-2b956235c58c","Type":"ContainerStarted","Data":"024963f8cbb55ebe7c4bbd80648c5d800f6606f9b002b8ae290401573319520a"} Feb 19 10:05:39 crc kubenswrapper[4873]: I0219 10:05:39.425543 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"adb0395e-00f8-4bc6-a0a6-2b956235c58c","Type":"ContainerStarted","Data":"80d2b1106b66a9e1508060bab5ff72cc07015decc7183687b312f61552fc028b"} Feb 19 10:05:39 crc kubenswrapper[4873]: I0219 10:05:39.448783 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.448767521 podStartE2EDuration="2.448767521s" podCreationTimestamp="2026-02-19 10:05:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:05:39.447019368 +0000 UTC m=+1248.736451006" watchObservedRunningTime="2026-02-19 10:05:39.448767521 +0000 UTC m=+1248.738199159" Feb 19 10:05:39 crc kubenswrapper[4873]: I0219 10:05:39.494989 4873 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db35141-6a4c-41cb-8a70-c68ab32fb2fe" path="/var/lib/kubelet/pods/8db35141-6a4c-41cb-8a70-c68ab32fb2fe/volumes"
Feb 19 10:05:40 crc kubenswrapper[4873]: I0219 10:05:40.845284 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 10:05:40 crc kubenswrapper[4873]: I0219 10:05:40.846077 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 10:05:43 crc kubenswrapper[4873]: I0219 10:05:43.071746 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 19 10:05:45 crc kubenswrapper[4873]: I0219 10:05:45.959032 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 10:05:46 crc kubenswrapper[4873]: I0219 10:05:46.020822 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 10:05:46 crc kubenswrapper[4873]: I0219 10:05:46.021142 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 10:05:46 crc kubenswrapper[4873]: I0219 10:05:46.021256 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 10:05:47 crc kubenswrapper[4873]: I0219 10:05:47.020491 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="15cbab3c-9843-4bf6-b0e8-b65dec1e5112" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 10:05:47 crc kubenswrapper[4873]: I0219 10:05:47.020587 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="15cbab3c-9843-4bf6-b0e8-b65dec1e5112" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 19 10:05:47 crc kubenswrapper[4873]: I0219 10:05:47.021037 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4f4e613e-0a31-4191-9afb-4fd0300586f9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 10:05:47 crc kubenswrapper[4873]: I0219 10:05:47.021872 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4f4e613e-0a31-4191-9afb-4fd0300586f9" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 10:05:48 crc kubenswrapper[4873]: I0219 10:05:48.071437 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 19 10:05:48 crc kubenswrapper[4873]: I0219 10:05:48.101221 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 19 10:05:49 crc kubenswrapper[4873]: I0219 10:05:49.035656 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 19 10:05:51 crc kubenswrapper[4873]: I0219 10:05:51.623669 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 19 10:05:55 crc kubenswrapper[4873]: I0219 10:05:55.850458 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 10:05:55 crc kubenswrapper[4873]: I0219 10:05:55.856741 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 10:05:55 crc kubenswrapper[4873]: I0219 10:05:55.866952 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 10:05:55 crc kubenswrapper[4873]: I0219 10:05:55.895643 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 10:05:55 crc kubenswrapper[4873]: I0219 10:05:55.896015 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 10:05:55 crc kubenswrapper[4873]: I0219 10:05:55.900008 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 10:05:55 crc kubenswrapper[4873]: I0219 10:05:55.912548 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 10:05:56 crc kubenswrapper[4873]: I0219 10:05:56.076440 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 10:05:56 crc kubenswrapper[4873]: I0219 10:05:56.081273 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 10:05:56 crc kubenswrapper[4873]: I0219 10:05:56.087626 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 10:06:03 crc kubenswrapper[4873]: I0219 10:06:03.900740 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 10:06:05 crc kubenswrapper[4873]: I0219 10:06:05.021617 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 10:06:07 crc kubenswrapper[4873]: I0219 10:06:07.310472 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="9251ac9a-275e-4622-83a2-121d59ec8cd1" containerName="rabbitmq" containerID="cri-o://bf1da2993e733619c7b1bf83f5278da19d4e84782b865399c0e41f4b70eb6bd9" gracePeriod=604797
Feb 19 10:06:08 crc kubenswrapper[4873]: I0219 10:06:08.364069 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="86685946-19ac-434a-974f-99b5beeda172" containerName="rabbitmq" containerID="cri-o://aaf5e7116cb1bb3c6e4b474f08773f59b49977479dfd0072b42ee20f8a60cf72" gracePeriod=604797
Feb 19 10:06:08 crc kubenswrapper[4873]: I0219 10:06:08.989735 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.142453 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjp76\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-kube-api-access-vjp76\") pod \"9251ac9a-275e-4622-83a2-121d59ec8cd1\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") "
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.142561 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"9251ac9a-275e-4622-83a2-121d59ec8cd1\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") "
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.142585 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9251ac9a-275e-4622-83a2-121d59ec8cd1-pod-info\") pod \"9251ac9a-275e-4622-83a2-121d59ec8cd1\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") "
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.143325 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-server-conf\") pod \"9251ac9a-275e-4622-83a2-121d59ec8cd1\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") "
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.143363 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-confd\") pod \"9251ac9a-275e-4622-83a2-121d59ec8cd1\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") "
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.143404 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-plugins-conf\") pod \"9251ac9a-275e-4622-83a2-121d59ec8cd1\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") "
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.143486 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-config-data\") pod \"9251ac9a-275e-4622-83a2-121d59ec8cd1\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") "
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.143547 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-tls\") pod \"9251ac9a-275e-4622-83a2-121d59ec8cd1\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") "
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.143590 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-erlang-cookie\") pod \"9251ac9a-275e-4622-83a2-121d59ec8cd1\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") "
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.143617 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-plugins\") pod \"9251ac9a-275e-4622-83a2-121d59ec8cd1\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") "
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.143715 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9251ac9a-275e-4622-83a2-121d59ec8cd1-erlang-cookie-secret\") pod \"9251ac9a-275e-4622-83a2-121d59ec8cd1\" (UID: \"9251ac9a-275e-4622-83a2-121d59ec8cd1\") "
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.145150 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9251ac9a-275e-4622-83a2-121d59ec8cd1" (UID: "9251ac9a-275e-4622-83a2-121d59ec8cd1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.147370 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9251ac9a-275e-4622-83a2-121d59ec8cd1" (UID: "9251ac9a-275e-4622-83a2-121d59ec8cd1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.150832 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9251ac9a-275e-4622-83a2-121d59ec8cd1" (UID: "9251ac9a-275e-4622-83a2-121d59ec8cd1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.152114 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9251ac9a-275e-4622-83a2-121d59ec8cd1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9251ac9a-275e-4622-83a2-121d59ec8cd1" (UID: "9251ac9a-275e-4622-83a2-121d59ec8cd1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.152350 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9251ac9a-275e-4622-83a2-121d59ec8cd1-pod-info" (OuterVolumeSpecName: "pod-info") pod "9251ac9a-275e-4622-83a2-121d59ec8cd1" (UID: "9251ac9a-275e-4622-83a2-121d59ec8cd1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.154970 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9251ac9a-275e-4622-83a2-121d59ec8cd1" (UID: "9251ac9a-275e-4622-83a2-121d59ec8cd1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.155692 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-kube-api-access-vjp76" (OuterVolumeSpecName: "kube-api-access-vjp76") pod "9251ac9a-275e-4622-83a2-121d59ec8cd1" (UID: "9251ac9a-275e-4622-83a2-121d59ec8cd1"). InnerVolumeSpecName "kube-api-access-vjp76". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.159156 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "9251ac9a-275e-4622-83a2-121d59ec8cd1" (UID: "9251ac9a-275e-4622-83a2-121d59ec8cd1"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.178238 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-config-data" (OuterVolumeSpecName: "config-data") pod "9251ac9a-275e-4622-83a2-121d59ec8cd1" (UID: "9251ac9a-275e-4622-83a2-121d59ec8cd1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.220652 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-server-conf" (OuterVolumeSpecName: "server-conf") pod "9251ac9a-275e-4622-83a2-121d59ec8cd1" (UID: "9251ac9a-275e-4622-83a2-121d59ec8cd1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.224397 4873 generic.go:334] "Generic (PLEG): container finished" podID="9251ac9a-275e-4622-83a2-121d59ec8cd1" containerID="bf1da2993e733619c7b1bf83f5278da19d4e84782b865399c0e41f4b70eb6bd9" exitCode=0
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.224470 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9251ac9a-275e-4622-83a2-121d59ec8cd1","Type":"ContainerDied","Data":"bf1da2993e733619c7b1bf83f5278da19d4e84782b865399c0e41f4b70eb6bd9"}
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.224504 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9251ac9a-275e-4622-83a2-121d59ec8cd1","Type":"ContainerDied","Data":"7661fe6352a716a9db14456953448866e2c9797ab10f540b398fdf6a05d1c0b7"}
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.224525 4873 scope.go:117] "RemoveContainer" containerID="bf1da2993e733619c7b1bf83f5278da19d4e84782b865399c0e41f4b70eb6bd9"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.224703 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.247214 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.247548 4873 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.247564 4873 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.247577 4873 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.247589 4873 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9251ac9a-275e-4622-83a2-121d59ec8cd1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.247599 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjp76\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-kube-api-access-vjp76\") on node \"crc\" DevicePath \"\""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.247625 4873 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.247638 4873 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9251ac9a-275e-4622-83a2-121d59ec8cd1-pod-info\") on node \"crc\" DevicePath \"\""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.247660 4873 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-server-conf\") on node \"crc\" DevicePath \"\""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.247672 4873 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9251ac9a-275e-4622-83a2-121d59ec8cd1-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.271240 4873 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.271482 4873 scope.go:117] "RemoveContainer" containerID="190198ff0e2017378a91068584666652381227259351cc680fcbb3817b8e453f"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.318730 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9251ac9a-275e-4622-83a2-121d59ec8cd1" (UID: "9251ac9a-275e-4622-83a2-121d59ec8cd1"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.333967 4873 scope.go:117] "RemoveContainer" containerID="bf1da2993e733619c7b1bf83f5278da19d4e84782b865399c0e41f4b70eb6bd9"
Feb 19 10:06:09 crc kubenswrapper[4873]: E0219 10:06:09.334351 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf1da2993e733619c7b1bf83f5278da19d4e84782b865399c0e41f4b70eb6bd9\": container with ID starting with bf1da2993e733619c7b1bf83f5278da19d4e84782b865399c0e41f4b70eb6bd9 not found: ID does not exist" containerID="bf1da2993e733619c7b1bf83f5278da19d4e84782b865399c0e41f4b70eb6bd9"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.334380 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf1da2993e733619c7b1bf83f5278da19d4e84782b865399c0e41f4b70eb6bd9"} err="failed to get container status \"bf1da2993e733619c7b1bf83f5278da19d4e84782b865399c0e41f4b70eb6bd9\": rpc error: code = NotFound desc = could not find container \"bf1da2993e733619c7b1bf83f5278da19d4e84782b865399c0e41f4b70eb6bd9\": container with ID starting with bf1da2993e733619c7b1bf83f5278da19d4e84782b865399c0e41f4b70eb6bd9 not found: ID does not exist"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.334402 4873 scope.go:117] "RemoveContainer" containerID="190198ff0e2017378a91068584666652381227259351cc680fcbb3817b8e453f"
Feb 19 10:06:09 crc kubenswrapper[4873]: E0219 10:06:09.334925 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"190198ff0e2017378a91068584666652381227259351cc680fcbb3817b8e453f\": container with ID starting with 190198ff0e2017378a91068584666652381227259351cc680fcbb3817b8e453f not found: ID does not exist" containerID="190198ff0e2017378a91068584666652381227259351cc680fcbb3817b8e453f"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.334960 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"190198ff0e2017378a91068584666652381227259351cc680fcbb3817b8e453f"} err="failed to get container status \"190198ff0e2017378a91068584666652381227259351cc680fcbb3817b8e453f\": rpc error: code = NotFound desc = could not find container \"190198ff0e2017378a91068584666652381227259351cc680fcbb3817b8e453f\": container with ID starting with 190198ff0e2017378a91068584666652381227259351cc680fcbb3817b8e453f not found: ID does not exist"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.349131 4873 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.349159 4873 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9251ac9a-275e-4622-83a2-121d59ec8cd1-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.546048 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.559886 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.584887 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 10:06:09 crc kubenswrapper[4873]: E0219 10:06:09.585364 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9251ac9a-275e-4622-83a2-121d59ec8cd1" containerName="rabbitmq"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.585380 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9251ac9a-275e-4622-83a2-121d59ec8cd1" containerName="rabbitmq"
Feb 19 10:06:09 crc kubenswrapper[4873]: E0219 10:06:09.585398 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9251ac9a-275e-4622-83a2-121d59ec8cd1" containerName="setup-container"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.585405 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9251ac9a-275e-4622-83a2-121d59ec8cd1" containerName="setup-container"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.585592 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="9251ac9a-275e-4622-83a2-121d59ec8cd1" containerName="rabbitmq"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.586999 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.594579 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.594789 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-fnhrw"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.594891 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.595008 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.595114 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.595751 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.595859 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.629996 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.756413 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7rq2\" (UniqueName: \"kubernetes.io/projected/d564a6d4-4702-4e96-b814-8d9f01db02e5-kube-api-access-z7rq2\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.756466 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d564a6d4-4702-4e96-b814-8d9f01db02e5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.756578 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d564a6d4-4702-4e96-b814-8d9f01db02e5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.756639 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d564a6d4-4702-4e96-b814-8d9f01db02e5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.756744 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d564a6d4-4702-4e96-b814-8d9f01db02e5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.756932 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.757127 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d564a6d4-4702-4e96-b814-8d9f01db02e5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.757195 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d564a6d4-4702-4e96-b814-8d9f01db02e5-config-data\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.757225 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d564a6d4-4702-4e96-b814-8d9f01db02e5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.757258 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d564a6d4-4702-4e96-b814-8d9f01db02e5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.757378 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d564a6d4-4702-4e96-b814-8d9f01db02e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.859497 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d564a6d4-4702-4e96-b814-8d9f01db02e5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.859561 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.859613 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d564a6d4-4702-4e96-b814-8d9f01db02e5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.859635 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d564a6d4-4702-4e96-b814-8d9f01db02e5-config-data\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.859653 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d564a6d4-4702-4e96-b814-8d9f01db02e5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.859671 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d564a6d4-4702-4e96-b814-8d9f01db02e5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.859704 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d564a6d4-4702-4e96-b814-8d9f01db02e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.859751 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7rq2\" (UniqueName: \"kubernetes.io/projected/d564a6d4-4702-4e96-b814-8d9f01db02e5-kube-api-access-z7rq2\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.859775 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d564a6d4-4702-4e96-b814-8d9f01db02e5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.859794 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d564a6d4-4702-4e96-b814-8d9f01db02e5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.859813 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d564a6d4-4702-4e96-b814-8d9f01db02e5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.861295 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.861619 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d564a6d4-4702-4e96-b814-8d9f01db02e5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.862178 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d564a6d4-4702-4e96-b814-8d9f01db02e5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.862675 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d564a6d4-4702-4e96-b814-8d9f01db02e5-config-data\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.867076 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d564a6d4-4702-4e96-b814-8d9f01db02e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.867040 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d564a6d4-4702-4e96-b814-8d9f01db02e5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.868707 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d564a6d4-4702-4e96-b814-8d9f01db02e5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.870696 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d564a6d4-4702-4e96-b814-8d9f01db02e5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.870810 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d564a6d4-4702-4e96-b814-8d9f01db02e5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.870855 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d564a6d4-4702-4e96-b814-8d9f01db02e5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.884092 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7rq2\" (UniqueName: \"kubernetes.io/projected/d564a6d4-4702-4e96-b814-8d9f01db02e5-kube-api-access-z7rq2\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:09 crc kubenswrapper[4873]: I0219 10:06:09.947408 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"d564a6d4-4702-4e96-b814-8d9f01db02e5\") " pod="openstack/rabbitmq-server-0"
Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.031494 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.060674 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.165300 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-config-data\") pod \"86685946-19ac-434a-974f-99b5beeda172\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") "
Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.165384 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-erlang-cookie\") pod \"86685946-19ac-434a-974f-99b5beeda172\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") "
Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.165462 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"86685946-19ac-434a-974f-99b5beeda172\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") "
Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.165557 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m275n\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-kube-api-access-m275n\") pod \"86685946-19ac-434a-974f-99b5beeda172\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") "
Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.165608 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-server-conf\") pod \"86685946-19ac-434a-974f-99b5beeda172\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") "
Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.165653 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-confd\") pod \"86685946-19ac-434a-974f-99b5beeda172\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") "
Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.165849 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-plugins-conf\") pod \"86685946-19ac-434a-974f-99b5beeda172\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") "
Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.166053 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/86685946-19ac-434a-974f-99b5beeda172-erlang-cookie-secret\") pod \"86685946-19ac-434a-974f-99b5beeda172\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") "
Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.166201 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\"
(UniqueName: \"kubernetes.io/empty-dir/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-plugins\") pod \"86685946-19ac-434a-974f-99b5beeda172\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.166248 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-tls\") pod \"86685946-19ac-434a-974f-99b5beeda172\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.166400 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/86685946-19ac-434a-974f-99b5beeda172-pod-info\") pod \"86685946-19ac-434a-974f-99b5beeda172\" (UID: \"86685946-19ac-434a-974f-99b5beeda172\") " Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.167869 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "86685946-19ac-434a-974f-99b5beeda172" (UID: "86685946-19ac-434a-974f-99b5beeda172"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.184260 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/86685946-19ac-434a-974f-99b5beeda172-pod-info" (OuterVolumeSpecName: "pod-info") pod "86685946-19ac-434a-974f-99b5beeda172" (UID: "86685946-19ac-434a-974f-99b5beeda172"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.185929 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "86685946-19ac-434a-974f-99b5beeda172" (UID: "86685946-19ac-434a-974f-99b5beeda172"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.191966 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-kube-api-access-m275n" (OuterVolumeSpecName: "kube-api-access-m275n") pod "86685946-19ac-434a-974f-99b5beeda172" (UID: "86685946-19ac-434a-974f-99b5beeda172"). InnerVolumeSpecName "kube-api-access-m275n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.192206 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "86685946-19ac-434a-974f-99b5beeda172" (UID: "86685946-19ac-434a-974f-99b5beeda172"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.194917 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "86685946-19ac-434a-974f-99b5beeda172" (UID: "86685946-19ac-434a-974f-99b5beeda172"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.195250 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "86685946-19ac-434a-974f-99b5beeda172" (UID: "86685946-19ac-434a-974f-99b5beeda172"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.212082 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86685946-19ac-434a-974f-99b5beeda172-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "86685946-19ac-434a-974f-99b5beeda172" (UID: "86685946-19ac-434a-974f-99b5beeda172"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.212338 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-config-data" (OuterVolumeSpecName: "config-data") pod "86685946-19ac-434a-974f-99b5beeda172" (UID: "86685946-19ac-434a-974f-99b5beeda172"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.249271 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-server-conf" (OuterVolumeSpecName: "server-conf") pod "86685946-19ac-434a-974f-99b5beeda172" (UID: "86685946-19ac-434a-974f-99b5beeda172"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.253549 4873 generic.go:334] "Generic (PLEG): container finished" podID="86685946-19ac-434a-974f-99b5beeda172" containerID="aaf5e7116cb1bb3c6e4b474f08773f59b49977479dfd0072b42ee20f8a60cf72" exitCode=0 Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.253594 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"86685946-19ac-434a-974f-99b5beeda172","Type":"ContainerDied","Data":"aaf5e7116cb1bb3c6e4b474f08773f59b49977479dfd0072b42ee20f8a60cf72"} Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.253646 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"86685946-19ac-434a-974f-99b5beeda172","Type":"ContainerDied","Data":"ef991a861997941a147c9b5a0da440f69f41ed8b1c1a849520b30accb3784df6"} Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.253667 4873 scope.go:117] "RemoveContainer" containerID="aaf5e7116cb1bb3c6e4b474f08773f59b49977479dfd0072b42ee20f8a60cf72" Feb 19 10:06:10 crc kubenswrapper[4873]: I0219 10:06:10.253893 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.270889 4873 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.271184 4873 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.271194 4873 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/86685946-19ac-434a-974f-99b5beeda172-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.271203 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.271213 4873 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.271251 4873 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.271260 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m275n\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-kube-api-access-m275n\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 
10:06:10.271268 4873 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.271276 4873 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/86685946-19ac-434a-974f-99b5beeda172-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.271284 4873 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/86685946-19ac-434a-974f-99b5beeda172-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.311573 4873 scope.go:117] "RemoveContainer" containerID="853752f484df74ddd70c3a27d9c1c59cdeac53c948f829a75d4f8ed34050d1fb" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.320878 4873 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.366638 4873 scope.go:117] "RemoveContainer" containerID="aaf5e7116cb1bb3c6e4b474f08773f59b49977479dfd0072b42ee20f8a60cf72" Feb 19 10:06:11 crc kubenswrapper[4873]: E0219 10:06:10.367150 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaf5e7116cb1bb3c6e4b474f08773f59b49977479dfd0072b42ee20f8a60cf72\": container with ID starting with aaf5e7116cb1bb3c6e4b474f08773f59b49977479dfd0072b42ee20f8a60cf72 not found: ID does not exist" containerID="aaf5e7116cb1bb3c6e4b474f08773f59b49977479dfd0072b42ee20f8a60cf72" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.367179 4873 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aaf5e7116cb1bb3c6e4b474f08773f59b49977479dfd0072b42ee20f8a60cf72"} err="failed to get container status \"aaf5e7116cb1bb3c6e4b474f08773f59b49977479dfd0072b42ee20f8a60cf72\": rpc error: code = NotFound desc = could not find container \"aaf5e7116cb1bb3c6e4b474f08773f59b49977479dfd0072b42ee20f8a60cf72\": container with ID starting with aaf5e7116cb1bb3c6e4b474f08773f59b49977479dfd0072b42ee20f8a60cf72 not found: ID does not exist" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.367204 4873 scope.go:117] "RemoveContainer" containerID="853752f484df74ddd70c3a27d9c1c59cdeac53c948f829a75d4f8ed34050d1fb" Feb 19 10:06:11 crc kubenswrapper[4873]: E0219 10:06:10.369683 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"853752f484df74ddd70c3a27d9c1c59cdeac53c948f829a75d4f8ed34050d1fb\": container with ID starting with 853752f484df74ddd70c3a27d9c1c59cdeac53c948f829a75d4f8ed34050d1fb not found: ID does not exist" containerID="853752f484df74ddd70c3a27d9c1c59cdeac53c948f829a75d4f8ed34050d1fb" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.369722 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"853752f484df74ddd70c3a27d9c1c59cdeac53c948f829a75d4f8ed34050d1fb"} err="failed to get container status \"853752f484df74ddd70c3a27d9c1c59cdeac53c948f829a75d4f8ed34050d1fb\": rpc error: code = NotFound desc = could not find container \"853752f484df74ddd70c3a27d9c1c59cdeac53c948f829a75d4f8ed34050d1fb\": container with ID starting with 853752f484df74ddd70c3a27d9c1c59cdeac53c948f829a75d4f8ed34050d1fb not found: ID does not exist" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.373094 4873 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 
10:06:10.383351 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "86685946-19ac-434a-974f-99b5beeda172" (UID: "86685946-19ac-434a-974f-99b5beeda172"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.475234 4873 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/86685946-19ac-434a-974f-99b5beeda172-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.630572 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.873238 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.891264 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.900443 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 10:06:11 crc kubenswrapper[4873]: E0219 10:06:10.900899 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86685946-19ac-434a-974f-99b5beeda172" containerName="setup-container" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.900912 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="86685946-19ac-434a-974f-99b5beeda172" containerName="setup-container" Feb 19 10:06:11 crc kubenswrapper[4873]: E0219 10:06:10.900950 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86685946-19ac-434a-974f-99b5beeda172" containerName="rabbitmq" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.900956 4873 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="86685946-19ac-434a-974f-99b5beeda172" containerName="rabbitmq" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.901167 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="86685946-19ac-434a-974f-99b5beeda172" containerName="rabbitmq" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.902233 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.904281 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.904424 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.904511 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.904531 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.905316 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6k7rl" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.906889 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.910448 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.913814 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.990627 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1150426f-909f-4b05-b216-ccf29f7039eb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.990735 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1150426f-909f-4b05-b216-ccf29f7039eb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.990762 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1150426f-909f-4b05-b216-ccf29f7039eb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.990787 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.990809 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1150426f-909f-4b05-b216-ccf29f7039eb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.990872 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/1150426f-909f-4b05-b216-ccf29f7039eb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.990933 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1150426f-909f-4b05-b216-ccf29f7039eb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.991012 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrrb7\" (UniqueName: \"kubernetes.io/projected/1150426f-909f-4b05-b216-ccf29f7039eb-kube-api-access-lrrb7\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.991132 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1150426f-909f-4b05-b216-ccf29f7039eb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.991201 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1150426f-909f-4b05-b216-ccf29f7039eb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:10.991252 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1150426f-909f-4b05-b216-ccf29f7039eb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.092718 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1150426f-909f-4b05-b216-ccf29f7039eb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.092811 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1150426f-909f-4b05-b216-ccf29f7039eb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.092849 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrrb7\" (UniqueName: \"kubernetes.io/projected/1150426f-909f-4b05-b216-ccf29f7039eb-kube-api-access-lrrb7\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.092897 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1150426f-909f-4b05-b216-ccf29f7039eb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.092926 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/1150426f-909f-4b05-b216-ccf29f7039eb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.092956 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1150426f-909f-4b05-b216-ccf29f7039eb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.093028 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1150426f-909f-4b05-b216-ccf29f7039eb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.093062 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1150426f-909f-4b05-b216-ccf29f7039eb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.093077 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1150426f-909f-4b05-b216-ccf29f7039eb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.093094 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.093122 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1150426f-909f-4b05-b216-ccf29f7039eb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.093475 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1150426f-909f-4b05-b216-ccf29f7039eb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.093856 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.094365 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1150426f-909f-4b05-b216-ccf29f7039eb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.095006 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1150426f-909f-4b05-b216-ccf29f7039eb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc 
kubenswrapper[4873]: I0219 10:06:11.095298 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1150426f-909f-4b05-b216-ccf29f7039eb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.099258 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1150426f-909f-4b05-b216-ccf29f7039eb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.104494 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1150426f-909f-4b05-b216-ccf29f7039eb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.112226 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1150426f-909f-4b05-b216-ccf29f7039eb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.113690 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1150426f-909f-4b05-b216-ccf29f7039eb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.114771 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/1150426f-909f-4b05-b216-ccf29f7039eb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.122763 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrrb7\" (UniqueName: \"kubernetes.io/projected/1150426f-909f-4b05-b216-ccf29f7039eb-kube-api-access-lrrb7\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.142204 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1150426f-909f-4b05-b216-ccf29f7039eb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.222532 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.313289 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d564a6d4-4702-4e96-b814-8d9f01db02e5","Type":"ContainerStarted","Data":"c7cc2db082581af74d278481e43014ad33c557364ed01b95368b3d16b032cd52"} Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.501587 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86685946-19ac-434a-974f-99b5beeda172" path="/var/lib/kubelet/pods/86685946-19ac-434a-974f-99b5beeda172/volumes" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.503998 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9251ac9a-275e-4622-83a2-121d59ec8cd1" path="/var/lib/kubelet/pods/9251ac9a-275e-4622-83a2-121d59ec8cd1/volumes" Feb 19 10:06:11 crc kubenswrapper[4873]: I0219 10:06:11.777281 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 10:06:12 crc kubenswrapper[4873]: I0219 10:06:12.324896 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d564a6d4-4702-4e96-b814-8d9f01db02e5","Type":"ContainerStarted","Data":"eb081e9d62d248fe118cb009fbd9c708c3a1079ddddb512798daffee0aad2659"} Feb 19 10:06:12 crc kubenswrapper[4873]: I0219 10:06:12.329114 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1150426f-909f-4b05-b216-ccf29f7039eb","Type":"ContainerStarted","Data":"b5136359797a48bb7031d97e15dfa7e628dac2b662ce623e5198317766d7b417"} Feb 19 10:06:14 crc kubenswrapper[4873]: I0219 10:06:14.349457 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1150426f-909f-4b05-b216-ccf29f7039eb","Type":"ContainerStarted","Data":"e92393b6741f6fef922d6f42af0b98f135a455cf95571e6601d11aec865d3f88"} Feb 19 10:06:18 crc kubenswrapper[4873]: I0219 
10:06:18.240251 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:06:18 crc kubenswrapper[4873]: I0219 10:06:18.240804 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.317704 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fdbdb9c55-dppcj"] Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.324231 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.331405 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.355780 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fdbdb9c55-dppcj"] Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.359272 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-ovsdbserver-nb\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.359330 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-ovsdbserver-sb\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.359358 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-openstack-edpm-ipam\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.359374 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-dns-swift-storage-0\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.359401 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-config\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.359424 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-dns-svc\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.359448 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8rw8h\" (UniqueName: \"kubernetes.io/projected/988a3bc7-bb05-4522-a7ae-7c3be4478924-kube-api-access-8rw8h\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.461224 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-ovsdbserver-nb\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.461294 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-ovsdbserver-sb\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.461334 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-openstack-edpm-ipam\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.461356 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-dns-swift-storage-0\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.461396 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-config\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.461426 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-dns-svc\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.461475 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rw8h\" (UniqueName: \"kubernetes.io/projected/988a3bc7-bb05-4522-a7ae-7c3be4478924-kube-api-access-8rw8h\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.462951 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-ovsdbserver-nb\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.463129 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-dns-swift-storage-0\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.463656 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-config\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.463855 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-ovsdbserver-sb\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.464376 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-dns-svc\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.465860 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-openstack-edpm-ipam\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.492827 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rw8h\" (UniqueName: \"kubernetes.io/projected/988a3bc7-bb05-4522-a7ae-7c3be4478924-kube-api-access-8rw8h\") pod \"dnsmasq-dns-6fdbdb9c55-dppcj\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:19 crc kubenswrapper[4873]: I0219 10:06:19.657870 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:20 crc kubenswrapper[4873]: I0219 10:06:20.118481 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fdbdb9c55-dppcj"] Feb 19 10:06:20 crc kubenswrapper[4873]: I0219 10:06:20.414718 4873 generic.go:334] "Generic (PLEG): container finished" podID="988a3bc7-bb05-4522-a7ae-7c3be4478924" containerID="c5f22a31813d866744a5f77fa474ee1d76800c4836a597abacab19d63e9b80d9" exitCode=0 Feb 19 10:06:20 crc kubenswrapper[4873]: I0219 10:06:20.414794 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" event={"ID":"988a3bc7-bb05-4522-a7ae-7c3be4478924","Type":"ContainerDied","Data":"c5f22a31813d866744a5f77fa474ee1d76800c4836a597abacab19d63e9b80d9"} Feb 19 10:06:20 crc kubenswrapper[4873]: I0219 10:06:20.415055 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" event={"ID":"988a3bc7-bb05-4522-a7ae-7c3be4478924","Type":"ContainerStarted","Data":"6aa2dd4165f3de832bd64c7d6658e46d1dc0230ed8bd63d75af9c25f8da8717b"} Feb 19 10:06:21 crc kubenswrapper[4873]: I0219 10:06:21.426717 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" event={"ID":"988a3bc7-bb05-4522-a7ae-7c3be4478924","Type":"ContainerStarted","Data":"72bdbd03b364c7b44bc9c7201f20a54c64ed8479aa1ac42baf6e1fee556e7bbc"} Feb 19 10:06:21 crc kubenswrapper[4873]: I0219 10:06:21.426991 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:21 crc kubenswrapper[4873]: I0219 10:06:21.450603 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" podStartSLOduration=2.450586434 podStartE2EDuration="2.450586434s" podCreationTimestamp="2026-02-19 10:06:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:06:21.44235877 +0000 UTC m=+1290.731790408" watchObservedRunningTime="2026-02-19 10:06:21.450586434 +0000 UTC m=+1290.740018072" Feb 19 10:06:29 crc kubenswrapper[4873]: I0219 10:06:29.660350 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:29 crc kubenswrapper[4873]: I0219 10:06:29.758165 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78d65dbfc-jjvbb"] Feb 19 10:06:29 crc kubenswrapper[4873]: I0219 10:06:29.758418 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" podUID="7fe74544-e8af-45bd-9193-2b247c5e002b" containerName="dnsmasq-dns" containerID="cri-o://2ea5c5513c19c4058c8a4129fb3b5a3547db474091e874784f52fc36d3b3d60e" gracePeriod=10 Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.000927 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c564b89cf-9v87f"] Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.004264 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.018379 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c564b89cf-9v87f"] Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.090409 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4fbv\" (UniqueName: \"kubernetes.io/projected/20253d93-eafe-45db-b11e-338714ffd978-kube-api-access-r4fbv\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.090823 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-config\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.090884 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-ovsdbserver-sb\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.090968 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.091116 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-ovsdbserver-nb\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.091270 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-dns-swift-storage-0\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.091531 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-dns-svc\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.194487 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4fbv\" (UniqueName: \"kubernetes.io/projected/20253d93-eafe-45db-b11e-338714ffd978-kube-api-access-r4fbv\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.194578 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-config\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.194617 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-ovsdbserver-sb\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.194676 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.194717 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-ovsdbserver-nb\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.194773 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-dns-swift-storage-0\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.194817 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-dns-svc\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.195854 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-dns-svc\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.196497 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.197199 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-ovsdbserver-nb\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.197704 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-config\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.197849 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-ovsdbserver-sb\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.198441 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/20253d93-eafe-45db-b11e-338714ffd978-dns-swift-storage-0\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.226296 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4fbv\" (UniqueName: \"kubernetes.io/projected/20253d93-eafe-45db-b11e-338714ffd978-kube-api-access-r4fbv\") pod \"dnsmasq-dns-6c564b89cf-9v87f\" (UID: \"20253d93-eafe-45db-b11e-338714ffd978\") " pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.350056 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.351514 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.398133 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-ovsdbserver-sb\") pod \"7fe74544-e8af-45bd-9193-2b247c5e002b\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.398209 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-dns-svc\") pod \"7fe74544-e8af-45bd-9193-2b247c5e002b\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.398324 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-dns-swift-storage-0\") pod \"7fe74544-e8af-45bd-9193-2b247c5e002b\" (UID: 
\"7fe74544-e8af-45bd-9193-2b247c5e002b\") " Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.398396 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-ovsdbserver-nb\") pod \"7fe74544-e8af-45bd-9193-2b247c5e002b\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.398720 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx9vs\" (UniqueName: \"kubernetes.io/projected/7fe74544-e8af-45bd-9193-2b247c5e002b-kube-api-access-zx9vs\") pod \"7fe74544-e8af-45bd-9193-2b247c5e002b\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.398895 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-config\") pod \"7fe74544-e8af-45bd-9193-2b247c5e002b\" (UID: \"7fe74544-e8af-45bd-9193-2b247c5e002b\") " Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.408368 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fe74544-e8af-45bd-9193-2b247c5e002b-kube-api-access-zx9vs" (OuterVolumeSpecName: "kube-api-access-zx9vs") pod "7fe74544-e8af-45bd-9193-2b247c5e002b" (UID: "7fe74544-e8af-45bd-9193-2b247c5e002b"). InnerVolumeSpecName "kube-api-access-zx9vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.456318 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7fe74544-e8af-45bd-9193-2b247c5e002b" (UID: "7fe74544-e8af-45bd-9193-2b247c5e002b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.464331 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7fe74544-e8af-45bd-9193-2b247c5e002b" (UID: "7fe74544-e8af-45bd-9193-2b247c5e002b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.471496 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-config" (OuterVolumeSpecName: "config") pod "7fe74544-e8af-45bd-9193-2b247c5e002b" (UID: "7fe74544-e8af-45bd-9193-2b247c5e002b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.526679 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx9vs\" (UniqueName: \"kubernetes.io/projected/7fe74544-e8af-45bd-9193-2b247c5e002b-kube-api-access-zx9vs\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.527182 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.527199 4873 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.527214 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.543038 4873 generic.go:334] "Generic (PLEG): container finished" podID="7fe74544-e8af-45bd-9193-2b247c5e002b" containerID="2ea5c5513c19c4058c8a4129fb3b5a3547db474091e874784f52fc36d3b3d60e" exitCode=0 Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.543144 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" event={"ID":"7fe74544-e8af-45bd-9193-2b247c5e002b","Type":"ContainerDied","Data":"2ea5c5513c19c4058c8a4129fb3b5a3547db474091e874784f52fc36d3b3d60e"} Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.543174 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" event={"ID":"7fe74544-e8af-45bd-9193-2b247c5e002b","Type":"ContainerDied","Data":"bc3e27b7897877cb015837d9764a384c7709da0f91ae1099d9eccef91911ed99"} Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.543220 4873 scope.go:117] "RemoveContainer" containerID="2ea5c5513c19c4058c8a4129fb3b5a3547db474091e874784f52fc36d3b3d60e" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.543392 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78d65dbfc-jjvbb" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.574941 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7fe74544-e8af-45bd-9193-2b247c5e002b" (UID: "7fe74544-e8af-45bd-9193-2b247c5e002b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.580243 4873 scope.go:117] "RemoveContainer" containerID="3584138ebb0ff2c764fd4906203bf8dafbda732c59288ffd05b66db47c5cbd6d" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.580639 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7fe74544-e8af-45bd-9193-2b247c5e002b" (UID: "7fe74544-e8af-45bd-9193-2b247c5e002b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.616276 4873 scope.go:117] "RemoveContainer" containerID="2ea5c5513c19c4058c8a4129fb3b5a3547db474091e874784f52fc36d3b3d60e" Feb 19 10:06:30 crc kubenswrapper[4873]: E0219 10:06:30.616654 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ea5c5513c19c4058c8a4129fb3b5a3547db474091e874784f52fc36d3b3d60e\": container with ID starting with 2ea5c5513c19c4058c8a4129fb3b5a3547db474091e874784f52fc36d3b3d60e not found: ID does not exist" containerID="2ea5c5513c19c4058c8a4129fb3b5a3547db474091e874784f52fc36d3b3d60e" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.616694 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ea5c5513c19c4058c8a4129fb3b5a3547db474091e874784f52fc36d3b3d60e"} err="failed to get container status \"2ea5c5513c19c4058c8a4129fb3b5a3547db474091e874784f52fc36d3b3d60e\": rpc error: code = NotFound desc = could not find container \"2ea5c5513c19c4058c8a4129fb3b5a3547db474091e874784f52fc36d3b3d60e\": container with ID starting with 2ea5c5513c19c4058c8a4129fb3b5a3547db474091e874784f52fc36d3b3d60e not found: ID does not exist" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.616721 4873 scope.go:117] 
"RemoveContainer" containerID="3584138ebb0ff2c764fd4906203bf8dafbda732c59288ffd05b66db47c5cbd6d" Feb 19 10:06:30 crc kubenswrapper[4873]: E0219 10:06:30.616992 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3584138ebb0ff2c764fd4906203bf8dafbda732c59288ffd05b66db47c5cbd6d\": container with ID starting with 3584138ebb0ff2c764fd4906203bf8dafbda732c59288ffd05b66db47c5cbd6d not found: ID does not exist" containerID="3584138ebb0ff2c764fd4906203bf8dafbda732c59288ffd05b66db47c5cbd6d" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.617021 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3584138ebb0ff2c764fd4906203bf8dafbda732c59288ffd05b66db47c5cbd6d"} err="failed to get container status \"3584138ebb0ff2c764fd4906203bf8dafbda732c59288ffd05b66db47c5cbd6d\": rpc error: code = NotFound desc = could not find container \"3584138ebb0ff2c764fd4906203bf8dafbda732c59288ffd05b66db47c5cbd6d\": container with ID starting with 3584138ebb0ff2c764fd4906203bf8dafbda732c59288ffd05b66db47c5cbd6d not found: ID does not exist" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.628732 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.628792 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fe74544-e8af-45bd-9193-2b247c5e002b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.886243 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78d65dbfc-jjvbb"] Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.895916 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-78d65dbfc-jjvbb"] Feb 19 10:06:30 crc kubenswrapper[4873]: I0219 10:06:30.906330 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c564b89cf-9v87f"] Feb 19 10:06:31 crc kubenswrapper[4873]: I0219 10:06:31.498176 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fe74544-e8af-45bd-9193-2b247c5e002b" path="/var/lib/kubelet/pods/7fe74544-e8af-45bd-9193-2b247c5e002b/volumes" Feb 19 10:06:31 crc kubenswrapper[4873]: I0219 10:06:31.558317 4873 generic.go:334] "Generic (PLEG): container finished" podID="20253d93-eafe-45db-b11e-338714ffd978" containerID="7b6fb2011e0fd962e7437b181f0d3894901e01de64f7de57469e1175d0fa86fe" exitCode=0 Feb 19 10:06:31 crc kubenswrapper[4873]: I0219 10:06:31.558366 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" event={"ID":"20253d93-eafe-45db-b11e-338714ffd978","Type":"ContainerDied","Data":"7b6fb2011e0fd962e7437b181f0d3894901e01de64f7de57469e1175d0fa86fe"} Feb 19 10:06:31 crc kubenswrapper[4873]: I0219 10:06:31.558393 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" event={"ID":"20253d93-eafe-45db-b11e-338714ffd978","Type":"ContainerStarted","Data":"fb7ad8e83fa657cd7fe3472370e61269633a748f3b0a1baf36e81604a562de5e"} Feb 19 10:06:32 crc kubenswrapper[4873]: I0219 10:06:32.569382 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" event={"ID":"20253d93-eafe-45db-b11e-338714ffd978","Type":"ContainerStarted","Data":"eddf2c14b9e6e7d59df006669936aba8a73592bd75124ff204c065af7b3f552e"} Feb 19 10:06:32 crc kubenswrapper[4873]: I0219 10:06:32.569811 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:32 crc kubenswrapper[4873]: I0219 10:06:32.595916 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" podStartSLOduration=3.5958963710000003 podStartE2EDuration="3.595896371s" podCreationTimestamp="2026-02-19 10:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:06:32.591787859 +0000 UTC m=+1301.881219497" watchObservedRunningTime="2026-02-19 10:06:32.595896371 +0000 UTC m=+1301.885328009" Feb 19 10:06:40 crc kubenswrapper[4873]: I0219 10:06:40.357308 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c564b89cf-9v87f" Feb 19 10:06:40 crc kubenswrapper[4873]: I0219 10:06:40.430531 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fdbdb9c55-dppcj"] Feb 19 10:06:40 crc kubenswrapper[4873]: I0219 10:06:40.430891 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" podUID="988a3bc7-bb05-4522-a7ae-7c3be4478924" containerName="dnsmasq-dns" containerID="cri-o://72bdbd03b364c7b44bc9c7201f20a54c64ed8479aa1ac42baf6e1fee556e7bbc" gracePeriod=10 Feb 19 10:06:40 crc kubenswrapper[4873]: I0219 10:06:40.708448 4873 generic.go:334] "Generic (PLEG): container finished" podID="988a3bc7-bb05-4522-a7ae-7c3be4478924" containerID="72bdbd03b364c7b44bc9c7201f20a54c64ed8479aa1ac42baf6e1fee556e7bbc" exitCode=0 Feb 19 10:06:40 crc kubenswrapper[4873]: I0219 10:06:40.708494 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" event={"ID":"988a3bc7-bb05-4522-a7ae-7c3be4478924","Type":"ContainerDied","Data":"72bdbd03b364c7b44bc9c7201f20a54c64ed8479aa1ac42baf6e1fee556e7bbc"} Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.042045 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.177692 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rw8h\" (UniqueName: \"kubernetes.io/projected/988a3bc7-bb05-4522-a7ae-7c3be4478924-kube-api-access-8rw8h\") pod \"988a3bc7-bb05-4522-a7ae-7c3be4478924\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.177771 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-ovsdbserver-sb\") pod \"988a3bc7-bb05-4522-a7ae-7c3be4478924\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.177872 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-openstack-edpm-ipam\") pod \"988a3bc7-bb05-4522-a7ae-7c3be4478924\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.177932 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-dns-svc\") pod \"988a3bc7-bb05-4522-a7ae-7c3be4478924\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.178655 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-ovsdbserver-nb\") pod \"988a3bc7-bb05-4522-a7ae-7c3be4478924\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.178784 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-config\") pod \"988a3bc7-bb05-4522-a7ae-7c3be4478924\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.178809 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-dns-swift-storage-0\") pod \"988a3bc7-bb05-4522-a7ae-7c3be4478924\" (UID: \"988a3bc7-bb05-4522-a7ae-7c3be4478924\") " Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.183412 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/988a3bc7-bb05-4522-a7ae-7c3be4478924-kube-api-access-8rw8h" (OuterVolumeSpecName: "kube-api-access-8rw8h") pod "988a3bc7-bb05-4522-a7ae-7c3be4478924" (UID: "988a3bc7-bb05-4522-a7ae-7c3be4478924"). InnerVolumeSpecName "kube-api-access-8rw8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.243507 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "988a3bc7-bb05-4522-a7ae-7c3be4478924" (UID: "988a3bc7-bb05-4522-a7ae-7c3be4478924"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.245085 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "988a3bc7-bb05-4522-a7ae-7c3be4478924" (UID: "988a3bc7-bb05-4522-a7ae-7c3be4478924"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.248000 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "988a3bc7-bb05-4522-a7ae-7c3be4478924" (UID: "988a3bc7-bb05-4522-a7ae-7c3be4478924"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.248400 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-config" (OuterVolumeSpecName: "config") pod "988a3bc7-bb05-4522-a7ae-7c3be4478924" (UID: "988a3bc7-bb05-4522-a7ae-7c3be4478924"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.251932 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "988a3bc7-bb05-4522-a7ae-7c3be4478924" (UID: "988a3bc7-bb05-4522-a7ae-7c3be4478924"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.255008 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "988a3bc7-bb05-4522-a7ae-7c3be4478924" (UID: "988a3bc7-bb05-4522-a7ae-7c3be4478924"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.281263 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.281302 4873 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.281316 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rw8h\" (UniqueName: \"kubernetes.io/projected/988a3bc7-bb05-4522-a7ae-7c3be4478924-kube-api-access-8rw8h\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.281327 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.281340 4873 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.281350 4873 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.281361 4873 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/988a3bc7-bb05-4522-a7ae-7c3be4478924-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.720534 
4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" event={"ID":"988a3bc7-bb05-4522-a7ae-7c3be4478924","Type":"ContainerDied","Data":"6aa2dd4165f3de832bd64c7d6658e46d1dc0230ed8bd63d75af9c25f8da8717b"} Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.720588 4873 scope.go:117] "RemoveContainer" containerID="72bdbd03b364c7b44bc9c7201f20a54c64ed8479aa1ac42baf6e1fee556e7bbc" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.720889 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fdbdb9c55-dppcj" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.747902 4873 scope.go:117] "RemoveContainer" containerID="c5f22a31813d866744a5f77fa474ee1d76800c4836a597abacab19d63e9b80d9" Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.761621 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fdbdb9c55-dppcj"] Feb 19 10:06:41 crc kubenswrapper[4873]: I0219 10:06:41.771571 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fdbdb9c55-dppcj"] Feb 19 10:06:43 crc kubenswrapper[4873]: I0219 10:06:43.496937 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="988a3bc7-bb05-4522-a7ae-7c3be4478924" path="/var/lib/kubelet/pods/988a3bc7-bb05-4522-a7ae-7c3be4478924/volumes" Feb 19 10:06:44 crc kubenswrapper[4873]: I0219 10:06:44.752623 4873 generic.go:334] "Generic (PLEG): container finished" podID="d564a6d4-4702-4e96-b814-8d9f01db02e5" containerID="eb081e9d62d248fe118cb009fbd9c708c3a1079ddddb512798daffee0aad2659" exitCode=0 Feb 19 10:06:44 crc kubenswrapper[4873]: I0219 10:06:44.752664 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d564a6d4-4702-4e96-b814-8d9f01db02e5","Type":"ContainerDied","Data":"eb081e9d62d248fe118cb009fbd9c708c3a1079ddddb512798daffee0aad2659"} Feb 19 10:06:46 crc kubenswrapper[4873]: I0219 10:06:46.775398 4873 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d564a6d4-4702-4e96-b814-8d9f01db02e5","Type":"ContainerStarted","Data":"a59923a07efdebf36fc246826c0fc72de70f7236debbfc2afcfce4a1705f0602"} Feb 19 10:06:46 crc kubenswrapper[4873]: I0219 10:06:46.776134 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 19 10:06:46 crc kubenswrapper[4873]: I0219 10:06:46.777343 4873 generic.go:334] "Generic (PLEG): container finished" podID="1150426f-909f-4b05-b216-ccf29f7039eb" containerID="e92393b6741f6fef922d6f42af0b98f135a455cf95571e6601d11aec865d3f88" exitCode=0 Feb 19 10:06:46 crc kubenswrapper[4873]: I0219 10:06:46.777364 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1150426f-909f-4b05-b216-ccf29f7039eb","Type":"ContainerDied","Data":"e92393b6741f6fef922d6f42af0b98f135a455cf95571e6601d11aec865d3f88"} Feb 19 10:06:46 crc kubenswrapper[4873]: I0219 10:06:46.841446 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.841415854 podStartE2EDuration="37.841415854s" podCreationTimestamp="2026-02-19 10:06:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:06:46.811476702 +0000 UTC m=+1316.100908340" watchObservedRunningTime="2026-02-19 10:06:46.841415854 +0000 UTC m=+1316.130847502" Feb 19 10:06:47 crc kubenswrapper[4873]: I0219 10:06:47.889432 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1150426f-909f-4b05-b216-ccf29f7039eb","Type":"ContainerStarted","Data":"e7a020b0278c9620d00ad390cbdc9faab9ea346dca5ae496025db4c612b262de"} Feb 19 10:06:47 crc kubenswrapper[4873]: I0219 10:06:47.890659 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:06:48 crc kubenswrapper[4873]: I0219 10:06:48.241024 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:06:48 crc kubenswrapper[4873]: I0219 10:06:48.241081 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.215632 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=44.215612173 podStartE2EDuration="44.215612173s" podCreationTimestamp="2026-02-19 10:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:06:47.969890509 +0000 UTC m=+1317.259322147" watchObservedRunningTime="2026-02-19 10:06:54.215612173 +0000 UTC m=+1323.505043811" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.225908 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn"] Feb 19 10:06:54 crc kubenswrapper[4873]: E0219 10:06:54.226584 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988a3bc7-bb05-4522-a7ae-7c3be4478924" containerName="dnsmasq-dns" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.226618 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="988a3bc7-bb05-4522-a7ae-7c3be4478924" containerName="dnsmasq-dns" Feb 19 10:06:54 crc kubenswrapper[4873]: E0219 10:06:54.226645 4873 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe74544-e8af-45bd-9193-2b247c5e002b" containerName="init" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.226657 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe74544-e8af-45bd-9193-2b247c5e002b" containerName="init" Feb 19 10:06:54 crc kubenswrapper[4873]: E0219 10:06:54.226697 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988a3bc7-bb05-4522-a7ae-7c3be4478924" containerName="init" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.226712 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="988a3bc7-bb05-4522-a7ae-7c3be4478924" containerName="init" Feb 19 10:06:54 crc kubenswrapper[4873]: E0219 10:06:54.226740 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe74544-e8af-45bd-9193-2b247c5e002b" containerName="dnsmasq-dns" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.226753 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe74544-e8af-45bd-9193-2b247c5e002b" containerName="dnsmasq-dns" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.227082 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe74544-e8af-45bd-9193-2b247c5e002b" containerName="dnsmasq-dns" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.227201 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="988a3bc7-bb05-4522-a7ae-7c3be4478924" containerName="dnsmasq-dns" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.228251 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.230385 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.230670 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.231011 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.232232 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.236711 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn"] Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.319869 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.320060 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgrsj\" (UniqueName: \"kubernetes.io/projected/fda37ba3-82f5-4d49-a15f-4dca53649ec7-kube-api-access-dgrsj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.320137 4873 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.320190 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.421885 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.422086 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.422287 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgrsj\" (UniqueName: 
\"kubernetes.io/projected/fda37ba3-82f5-4d49-a15f-4dca53649ec7-kube-api-access-dgrsj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.422322 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.429231 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.429966 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.432938 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.445083 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgrsj\" (UniqueName: \"kubernetes.io/projected/fda37ba3-82f5-4d49-a15f-4dca53649ec7-kube-api-access-dgrsj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:06:54 crc kubenswrapper[4873]: I0219 10:06:54.554008 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:06:55 crc kubenswrapper[4873]: I0219 10:06:55.165291 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn"] Feb 19 10:06:55 crc kubenswrapper[4873]: I0219 10:06:55.978380 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" event={"ID":"fda37ba3-82f5-4d49-a15f-4dca53649ec7","Type":"ContainerStarted","Data":"2537d37d9435dc0666f8a0e5eee660828300fdb6865c995e2e3548737cc40d44"} Feb 19 10:07:00 crc kubenswrapper[4873]: I0219 10:07:00.064338 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 19 10:07:01 crc kubenswrapper[4873]: I0219 10:07:01.228309 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 19 10:07:07 crc kubenswrapper[4873]: I0219 10:07:07.111772 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" event={"ID":"fda37ba3-82f5-4d49-a15f-4dca53649ec7","Type":"ContainerStarted","Data":"93fa48ed56c25accad984cd4a8e384efd75d4a5f89fc72d2bd295f628f6bf22f"} Feb 19 10:07:08 crc kubenswrapper[4873]: I0219 10:07:08.146861 
4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" podStartSLOduration=2.502466896 podStartE2EDuration="14.146840794s" podCreationTimestamp="2026-02-19 10:06:54 +0000 UTC" firstStartedPulling="2026-02-19 10:06:55.165149033 +0000 UTC m=+1324.454580671" lastFinishedPulling="2026-02-19 10:07:06.809522931 +0000 UTC m=+1336.098954569" observedRunningTime="2026-02-19 10:07:08.136175 +0000 UTC m=+1337.425606668" watchObservedRunningTime="2026-02-19 10:07:08.146840794 +0000 UTC m=+1337.436272442" Feb 19 10:07:18 crc kubenswrapper[4873]: I0219 10:07:18.240817 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:07:18 crc kubenswrapper[4873]: I0219 10:07:18.241367 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:07:18 crc kubenswrapper[4873]: I0219 10:07:18.241414 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 10:07:18 crc kubenswrapper[4873]: I0219 10:07:18.241984 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fe114037dbb1e5c10911ab253f48b67258ca8f08b33d891b20892a3cde8544ad"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 10:07:18 crc 
kubenswrapper[4873]: I0219 10:07:18.242055 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://fe114037dbb1e5c10911ab253f48b67258ca8f08b33d891b20892a3cde8544ad" gracePeriod=600 Feb 19 10:07:19 crc kubenswrapper[4873]: I0219 10:07:19.233742 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="fe114037dbb1e5c10911ab253f48b67258ca8f08b33d891b20892a3cde8544ad" exitCode=0 Feb 19 10:07:19 crc kubenswrapper[4873]: I0219 10:07:19.233812 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"fe114037dbb1e5c10911ab253f48b67258ca8f08b33d891b20892a3cde8544ad"} Feb 19 10:07:19 crc kubenswrapper[4873]: I0219 10:07:19.234165 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790"} Feb 19 10:07:19 crc kubenswrapper[4873]: I0219 10:07:19.234188 4873 scope.go:117] "RemoveContainer" containerID="4cf449f514dc24e840144e6f6decb8f1a064252cdbd9c34d791686fe659362f0" Feb 19 10:07:19 crc kubenswrapper[4873]: I0219 10:07:19.235892 4873 generic.go:334] "Generic (PLEG): container finished" podID="fda37ba3-82f5-4d49-a15f-4dca53649ec7" containerID="93fa48ed56c25accad984cd4a8e384efd75d4a5f89fc72d2bd295f628f6bf22f" exitCode=0 Feb 19 10:07:19 crc kubenswrapper[4873]: I0219 10:07:19.235924 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" 
event={"ID":"fda37ba3-82f5-4d49-a15f-4dca53649ec7","Type":"ContainerDied","Data":"93fa48ed56c25accad984cd4a8e384efd75d4a5f89fc72d2bd295f628f6bf22f"} Feb 19 10:07:20 crc kubenswrapper[4873]: I0219 10:07:20.893703 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:07:20 crc kubenswrapper[4873]: I0219 10:07:20.997918 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgrsj\" (UniqueName: \"kubernetes.io/projected/fda37ba3-82f5-4d49-a15f-4dca53649ec7-kube-api-access-dgrsj\") pod \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " Feb 19 10:07:20 crc kubenswrapper[4873]: I0219 10:07:20.998083 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-ssh-key-openstack-edpm-ipam\") pod \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " Feb 19 10:07:20 crc kubenswrapper[4873]: I0219 10:07:20.998282 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-repo-setup-combined-ca-bundle\") pod \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " Feb 19 10:07:20 crc kubenswrapper[4873]: I0219 10:07:20.998354 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-inventory\") pod \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\" (UID: \"fda37ba3-82f5-4d49-a15f-4dca53649ec7\") " Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.005256 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "fda37ba3-82f5-4d49-a15f-4dca53649ec7" (UID: "fda37ba3-82f5-4d49-a15f-4dca53649ec7"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.012324 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda37ba3-82f5-4d49-a15f-4dca53649ec7-kube-api-access-dgrsj" (OuterVolumeSpecName: "kube-api-access-dgrsj") pod "fda37ba3-82f5-4d49-a15f-4dca53649ec7" (UID: "fda37ba3-82f5-4d49-a15f-4dca53649ec7"). InnerVolumeSpecName "kube-api-access-dgrsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.027461 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fda37ba3-82f5-4d49-a15f-4dca53649ec7" (UID: "fda37ba3-82f5-4d49-a15f-4dca53649ec7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.032703 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-inventory" (OuterVolumeSpecName: "inventory") pod "fda37ba3-82f5-4d49-a15f-4dca53649ec7" (UID: "fda37ba3-82f5-4d49-a15f-4dca53649ec7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.101056 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.101129 4873 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.101143 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fda37ba3-82f5-4d49-a15f-4dca53649ec7-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.101155 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgrsj\" (UniqueName: \"kubernetes.io/projected/fda37ba3-82f5-4d49-a15f-4dca53649ec7-kube-api-access-dgrsj\") on node \"crc\" DevicePath \"\"" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.263479 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" event={"ID":"fda37ba3-82f5-4d49-a15f-4dca53649ec7","Type":"ContainerDied","Data":"2537d37d9435dc0666f8a0e5eee660828300fdb6865c995e2e3548737cc40d44"} Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.263727 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2537d37d9435dc0666f8a0e5eee660828300fdb6865c995e2e3548737cc40d44" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.263788 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.434592 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6"] Feb 19 10:07:21 crc kubenswrapper[4873]: E0219 10:07:21.435032 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fda37ba3-82f5-4d49-a15f-4dca53649ec7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.435050 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda37ba3-82f5-4d49-a15f-4dca53649ec7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.435257 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="fda37ba3-82f5-4d49-a15f-4dca53649ec7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.436892 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.439746 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.439985 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.440147 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.448208 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6"] Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.451848 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.508906 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmxqw\" (UniqueName: \"kubernetes.io/projected/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-kube-api-access-pmxqw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mt2n6\" (UID: \"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.508997 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mt2n6\" (UID: \"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.509084 4873 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mt2n6\" (UID: \"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.610301 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmxqw\" (UniqueName: \"kubernetes.io/projected/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-kube-api-access-pmxqw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mt2n6\" (UID: \"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.610430 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mt2n6\" (UID: \"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.610515 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mt2n6\" (UID: \"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.617907 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-mt2n6\" (UID: \"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.620632 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mt2n6\" (UID: \"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.627440 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmxqw\" (UniqueName: \"kubernetes.io/projected/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-kube-api-access-pmxqw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mt2n6\" (UID: \"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" Feb 19 10:07:21 crc kubenswrapper[4873]: I0219 10:07:21.768589 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" Feb 19 10:07:22 crc kubenswrapper[4873]: I0219 10:07:22.458582 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6"] Feb 19 10:07:22 crc kubenswrapper[4873]: W0219 10:07:22.502925 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ba1c3b5_6b1a_4d7e_bbdd_fb492abd6647.slice/crio-a66a2ac00db4b0ce1cf689d6373930dcff701c9b030b9ffb68e1f54a4a051883 WatchSource:0}: Error finding container a66a2ac00db4b0ce1cf689d6373930dcff701c9b030b9ffb68e1f54a4a051883: Status 404 returned error can't find the container with id a66a2ac00db4b0ce1cf689d6373930dcff701c9b030b9ffb68e1f54a4a051883 Feb 19 10:07:23 crc kubenswrapper[4873]: I0219 10:07:23.283541 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" event={"ID":"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647","Type":"ContainerStarted","Data":"a66a2ac00db4b0ce1cf689d6373930dcff701c9b030b9ffb68e1f54a4a051883"} Feb 19 10:07:24 crc kubenswrapper[4873]: I0219 10:07:24.294424 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" event={"ID":"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647","Type":"ContainerStarted","Data":"9abf7155230d56a289acb3e84c2905166e6cff2383d1c2040f888bc83038da2d"} Feb 19 10:07:24 crc kubenswrapper[4873]: I0219 10:07:24.321939 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" podStartSLOduration=2.692315584 podStartE2EDuration="3.321923002s" podCreationTimestamp="2026-02-19 10:07:21 +0000 UTC" firstStartedPulling="2026-02-19 10:07:22.506118157 +0000 UTC m=+1351.795549795" lastFinishedPulling="2026-02-19 10:07:23.135725575 +0000 UTC m=+1352.425157213" observedRunningTime="2026-02-19 
10:07:24.310593331 +0000 UTC m=+1353.600024969" watchObservedRunningTime="2026-02-19 10:07:24.321923002 +0000 UTC m=+1353.611354640" Feb 19 10:07:26 crc kubenswrapper[4873]: I0219 10:07:26.314581 4873 generic.go:334] "Generic (PLEG): container finished" podID="3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647" containerID="9abf7155230d56a289acb3e84c2905166e6cff2383d1c2040f888bc83038da2d" exitCode=0 Feb 19 10:07:26 crc kubenswrapper[4873]: I0219 10:07:26.314656 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" event={"ID":"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647","Type":"ContainerDied","Data":"9abf7155230d56a289acb3e84c2905166e6cff2383d1c2040f888bc83038da2d"} Feb 19 10:07:27 crc kubenswrapper[4873]: I0219 10:07:27.770600 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" Feb 19 10:07:27 crc kubenswrapper[4873]: I0219 10:07:27.873359 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmxqw\" (UniqueName: \"kubernetes.io/projected/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-kube-api-access-pmxqw\") pod \"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\" (UID: \"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\") " Feb 19 10:07:27 crc kubenswrapper[4873]: I0219 10:07:27.873416 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-inventory\") pod \"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\" (UID: \"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\") " Feb 19 10:07:27 crc kubenswrapper[4873]: I0219 10:07:27.873449 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-ssh-key-openstack-edpm-ipam\") pod \"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\" (UID: 
\"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647\") " Feb 19 10:07:27 crc kubenswrapper[4873]: I0219 10:07:27.880471 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-kube-api-access-pmxqw" (OuterVolumeSpecName: "kube-api-access-pmxqw") pod "3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647" (UID: "3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647"). InnerVolumeSpecName "kube-api-access-pmxqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:07:27 crc kubenswrapper[4873]: I0219 10:07:27.908324 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647" (UID: "3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:07:27 crc kubenswrapper[4873]: I0219 10:07:27.908882 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-inventory" (OuterVolumeSpecName: "inventory") pod "3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647" (UID: "3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:07:27 crc kubenswrapper[4873]: I0219 10:07:27.975663 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmxqw\" (UniqueName: \"kubernetes.io/projected/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-kube-api-access-pmxqw\") on node \"crc\" DevicePath \"\"" Feb 19 10:07:27 crc kubenswrapper[4873]: I0219 10:07:27.975706 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 10:07:27 crc kubenswrapper[4873]: I0219 10:07:27.975720 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.339630 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" event={"ID":"3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647","Type":"ContainerDied","Data":"a66a2ac00db4b0ce1cf689d6373930dcff701c9b030b9ffb68e1f54a4a051883"} Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.339871 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a66a2ac00db4b0ce1cf689d6373930dcff701c9b030b9ffb68e1f54a4a051883" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.339696 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mt2n6" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.417464 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r"] Feb 19 10:07:28 crc kubenswrapper[4873]: E0219 10:07:28.417859 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.417874 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.418049 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.418826 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.421261 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.421423 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.421900 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.432946 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.437929 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r"] Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.587253 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nknhk\" (UniqueName: \"kubernetes.io/projected/fb8aa6eb-a92d-47ab-803f-664399242dde-kube-api-access-nknhk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.587567 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 
10:07:28.588432 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.588880 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.690903 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nknhk\" (UniqueName: \"kubernetes.io/projected/fb8aa6eb-a92d-47ab-803f-664399242dde-kube-api-access-nknhk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.691293 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.691416 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.691478 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.695366 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.696191 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.700316 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.706497 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nknhk\" (UniqueName: \"kubernetes.io/projected/fb8aa6eb-a92d-47ab-803f-664399242dde-kube-api-access-nknhk\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:07:28 crc kubenswrapper[4873]: I0219 10:07:28.737022 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:07:29 crc kubenswrapper[4873]: I0219 10:07:29.297373 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r"] Feb 19 10:07:29 crc kubenswrapper[4873]: I0219 10:07:29.353591 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" event={"ID":"fb8aa6eb-a92d-47ab-803f-664399242dde","Type":"ContainerStarted","Data":"e61fa5adb3a25729e4ecf38db740518b7a6ba9fcd4130132a83dd97938b102a1"} Feb 19 10:07:30 crc kubenswrapper[4873]: I0219 10:07:30.366280 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" event={"ID":"fb8aa6eb-a92d-47ab-803f-664399242dde","Type":"ContainerStarted","Data":"5a3eb0b6bd6c7101e7f4001df9bf0b1c21607b64acd82737d7c08daff696a875"} Feb 19 10:07:30 crc kubenswrapper[4873]: I0219 10:07:30.390853 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" podStartSLOduration=1.98012219 podStartE2EDuration="2.390834472s" podCreationTimestamp="2026-02-19 10:07:28 +0000 UTC" firstStartedPulling="2026-02-19 10:07:29.30486108 +0000 UTC m=+1358.594292718" 
lastFinishedPulling="2026-02-19 10:07:29.715573372 +0000 UTC m=+1359.005005000" observedRunningTime="2026-02-19 10:07:30.381630634 +0000 UTC m=+1359.671062282" watchObservedRunningTime="2026-02-19 10:07:30.390834472 +0000 UTC m=+1359.680266110" Feb 19 10:07:55 crc kubenswrapper[4873]: I0219 10:07:55.899064 4873 scope.go:117] "RemoveContainer" containerID="2f0ffc7ea2219fb39042b2ae636be2bc871ede3a5af5f5056178cf8abfebcb4d" Feb 19 10:07:55 crc kubenswrapper[4873]: I0219 10:07:55.930573 4873 scope.go:117] "RemoveContainer" containerID="300c17fe87cdc74fea5cc1a915ff92db53e3c3a4eee6ced7352b06833035dffb" Feb 19 10:08:56 crc kubenswrapper[4873]: I0219 10:08:56.039560 4873 scope.go:117] "RemoveContainer" containerID="dfe8a7cf5aeabc3bd0899d011af3258a91fc6d795682d82ec9f1f9c569448452" Feb 19 10:08:56 crc kubenswrapper[4873]: I0219 10:08:56.105766 4873 scope.go:117] "RemoveContainer" containerID="1a1b6f4ba694daddb17f029a0bbce06c79e8294e69f096dade9d91ac98c03f81" Feb 19 10:08:56 crc kubenswrapper[4873]: I0219 10:08:56.145140 4873 scope.go:117] "RemoveContainer" containerID="f728a5cace0f3c84844ee9bd7c5a0c48b5b5cad808dd5c682427cb942eb77db6" Feb 19 10:09:18 crc kubenswrapper[4873]: I0219 10:09:18.240982 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:09:18 crc kubenswrapper[4873]: I0219 10:09:18.241623 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:09:48 crc kubenswrapper[4873]: I0219 10:09:48.240387 4873 patch_prober.go:28] interesting 
pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:09:48 crc kubenswrapper[4873]: I0219 10:09:48.240871 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:10:18 crc kubenswrapper[4873]: I0219 10:10:18.241082 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:10:18 crc kubenswrapper[4873]: I0219 10:10:18.242010 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:10:18 crc kubenswrapper[4873]: I0219 10:10:18.242083 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 10:10:18 crc kubenswrapper[4873]: I0219 10:10:18.243501 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Feb 19 10:10:18 crc kubenswrapper[4873]: I0219 10:10:18.243573 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" gracePeriod=600 Feb 19 10:10:18 crc kubenswrapper[4873]: E0219 10:10:18.367226 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:10:19 crc kubenswrapper[4873]: I0219 10:10:19.074948 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" exitCode=0 Feb 19 10:10:19 crc kubenswrapper[4873]: I0219 10:10:19.075004 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790"} Feb 19 10:10:19 crc kubenswrapper[4873]: I0219 10:10:19.075047 4873 scope.go:117] "RemoveContainer" containerID="fe114037dbb1e5c10911ab253f48b67258ca8f08b33d891b20892a3cde8544ad" Feb 19 10:10:19 crc kubenswrapper[4873]: I0219 10:10:19.075613 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" Feb 19 10:10:19 crc kubenswrapper[4873]: E0219 10:10:19.076010 4873 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:10:33 crc kubenswrapper[4873]: I0219 10:10:33.232775 4873 generic.go:334] "Generic (PLEG): container finished" podID="fb8aa6eb-a92d-47ab-803f-664399242dde" containerID="5a3eb0b6bd6c7101e7f4001df9bf0b1c21607b64acd82737d7c08daff696a875" exitCode=0 Feb 19 10:10:33 crc kubenswrapper[4873]: I0219 10:10:33.233339 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" event={"ID":"fb8aa6eb-a92d-47ab-803f-664399242dde","Type":"ContainerDied","Data":"5a3eb0b6bd6c7101e7f4001df9bf0b1c21607b64acd82737d7c08daff696a875"} Feb 19 10:10:33 crc kubenswrapper[4873]: I0219 10:10:33.485051 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" Feb 19 10:10:33 crc kubenswrapper[4873]: E0219 10:10:33.485415 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:10:34 crc kubenswrapper[4873]: I0219 10:10:34.738410 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:10:34 crc kubenswrapper[4873]: I0219 10:10:34.758379 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-inventory\") pod \"fb8aa6eb-a92d-47ab-803f-664399242dde\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " Feb 19 10:10:34 crc kubenswrapper[4873]: I0219 10:10:34.758594 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nknhk\" (UniqueName: \"kubernetes.io/projected/fb8aa6eb-a92d-47ab-803f-664399242dde-kube-api-access-nknhk\") pod \"fb8aa6eb-a92d-47ab-803f-664399242dde\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " Feb 19 10:10:34 crc kubenswrapper[4873]: I0219 10:10:34.758653 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-ssh-key-openstack-edpm-ipam\") pod \"fb8aa6eb-a92d-47ab-803f-664399242dde\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " Feb 19 10:10:34 crc kubenswrapper[4873]: I0219 10:10:34.758783 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-bootstrap-combined-ca-bundle\") pod \"fb8aa6eb-a92d-47ab-803f-664399242dde\" (UID: \"fb8aa6eb-a92d-47ab-803f-664399242dde\") " Feb 19 10:10:34 crc kubenswrapper[4873]: I0219 10:10:34.766880 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fb8aa6eb-a92d-47ab-803f-664399242dde" (UID: "fb8aa6eb-a92d-47ab-803f-664399242dde"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:10:34 crc kubenswrapper[4873]: I0219 10:10:34.767064 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8aa6eb-a92d-47ab-803f-664399242dde-kube-api-access-nknhk" (OuterVolumeSpecName: "kube-api-access-nknhk") pod "fb8aa6eb-a92d-47ab-803f-664399242dde" (UID: "fb8aa6eb-a92d-47ab-803f-664399242dde"). InnerVolumeSpecName "kube-api-access-nknhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:10:34 crc kubenswrapper[4873]: I0219 10:10:34.808558 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-inventory" (OuterVolumeSpecName: "inventory") pod "fb8aa6eb-a92d-47ab-803f-664399242dde" (UID: "fb8aa6eb-a92d-47ab-803f-664399242dde"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:10:34 crc kubenswrapper[4873]: I0219 10:10:34.810595 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fb8aa6eb-a92d-47ab-803f-664399242dde" (UID: "fb8aa6eb-a92d-47ab-803f-664399242dde"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:10:34 crc kubenswrapper[4873]: I0219 10:10:34.862328 4873 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:10:34 crc kubenswrapper[4873]: I0219 10:10:34.862380 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 10:10:34 crc kubenswrapper[4873]: I0219 10:10:34.862393 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nknhk\" (UniqueName: \"kubernetes.io/projected/fb8aa6eb-a92d-47ab-803f-664399242dde-kube-api-access-nknhk\") on node \"crc\" DevicePath \"\"" Feb 19 10:10:34 crc kubenswrapper[4873]: I0219 10:10:34.862404 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fb8aa6eb-a92d-47ab-803f-664399242dde-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.258921 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" event={"ID":"fb8aa6eb-a92d-47ab-803f-664399242dde","Type":"ContainerDied","Data":"e61fa5adb3a25729e4ecf38db740518b7a6ba9fcd4130132a83dd97938b102a1"} Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.258988 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e61fa5adb3a25729e4ecf38db740518b7a6ba9fcd4130132a83dd97938b102a1" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.259036 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.344051 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj"] Feb 19 10:10:35 crc kubenswrapper[4873]: E0219 10:10:35.344667 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8aa6eb-a92d-47ab-803f-664399242dde" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.344695 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8aa6eb-a92d-47ab-803f-664399242dde" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.345005 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8aa6eb-a92d-47ab-803f-664399242dde" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.345960 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.348960 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.349955 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.350286 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.353609 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.361447 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj"] Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.377206 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj\" (UID: \"ab7d5a49-ac61-4963-8766-1716098f3d4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.377317 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj\" (UID: \"ab7d5a49-ac61-4963-8766-1716098f3d4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 
10:10:35.377407 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj8jp\" (UniqueName: \"kubernetes.io/projected/ab7d5a49-ac61-4963-8766-1716098f3d4c-kube-api-access-cj8jp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj\" (UID: \"ab7d5a49-ac61-4963-8766-1716098f3d4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.480520 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj\" (UID: \"ab7d5a49-ac61-4963-8766-1716098f3d4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.480726 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj\" (UID: \"ab7d5a49-ac61-4963-8766-1716098f3d4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.480851 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj8jp\" (UniqueName: \"kubernetes.io/projected/ab7d5a49-ac61-4963-8766-1716098f3d4c-kube-api-access-cj8jp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj\" (UID: \"ab7d5a49-ac61-4963-8766-1716098f3d4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.486349 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj\" (UID: \"ab7d5a49-ac61-4963-8766-1716098f3d4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.487492 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj\" (UID: \"ab7d5a49-ac61-4963-8766-1716098f3d4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.515588 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj8jp\" (UniqueName: \"kubernetes.io/projected/ab7d5a49-ac61-4963-8766-1716098f3d4c-kube-api-access-cj8jp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj\" (UID: \"ab7d5a49-ac61-4963-8766-1716098f3d4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" Feb 19 10:10:35 crc kubenswrapper[4873]: I0219 10:10:35.663887 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" Feb 19 10:10:36 crc kubenswrapper[4873]: I0219 10:10:36.272845 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 10:10:36 crc kubenswrapper[4873]: I0219 10:10:36.278599 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj"] Feb 19 10:10:37 crc kubenswrapper[4873]: I0219 10:10:37.283140 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" event={"ID":"ab7d5a49-ac61-4963-8766-1716098f3d4c","Type":"ContainerStarted","Data":"5a71dd5b4760261dd2be0e41411c70e6162e350c368db3bf6d1ef5a664d01e28"} Feb 19 10:10:37 crc kubenswrapper[4873]: I0219 10:10:37.283507 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" event={"ID":"ab7d5a49-ac61-4963-8766-1716098f3d4c","Type":"ContainerStarted","Data":"104a5f4bca47feaf4bdaf13262521e9dce1f7e82a92383ad952b55930e0a5622"} Feb 19 10:10:37 crc kubenswrapper[4873]: I0219 10:10:37.310587 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" podStartSLOduration=1.8451071959999998 podStartE2EDuration="2.310550032s" podCreationTimestamp="2026-02-19 10:10:35 +0000 UTC" firstStartedPulling="2026-02-19 10:10:36.272476873 +0000 UTC m=+1545.561908511" lastFinishedPulling="2026-02-19 10:10:36.737919709 +0000 UTC m=+1546.027351347" observedRunningTime="2026-02-19 10:10:37.303052574 +0000 UTC m=+1546.592484212" watchObservedRunningTime="2026-02-19 10:10:37.310550032 +0000 UTC m=+1546.599981670" Feb 19 10:10:45 crc kubenswrapper[4873]: I0219 10:10:45.484557 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" Feb 19 10:10:45 crc 
kubenswrapper[4873]: E0219 10:10:45.485386 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:10:58 crc kubenswrapper[4873]: I0219 10:10:58.484704 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" Feb 19 10:10:58 crc kubenswrapper[4873]: E0219 10:10:58.485450 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:11:00 crc kubenswrapper[4873]: I0219 10:11:00.087825 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dn7mz"] Feb 19 10:11:00 crc kubenswrapper[4873]: I0219 10:11:00.090209 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:00 crc kubenswrapper[4873]: I0219 10:11:00.107863 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dn7mz"] Feb 19 10:11:00 crc kubenswrapper[4873]: I0219 10:11:00.135387 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9454e008-863b-47f7-8b39-98d7b7a128cb-utilities\") pod \"community-operators-dn7mz\" (UID: \"9454e008-863b-47f7-8b39-98d7b7a128cb\") " pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:00 crc kubenswrapper[4873]: I0219 10:11:00.135460 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9454e008-863b-47f7-8b39-98d7b7a128cb-catalog-content\") pod \"community-operators-dn7mz\" (UID: \"9454e008-863b-47f7-8b39-98d7b7a128cb\") " pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:00 crc kubenswrapper[4873]: I0219 10:11:00.135520 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqtgw\" (UniqueName: \"kubernetes.io/projected/9454e008-863b-47f7-8b39-98d7b7a128cb-kube-api-access-nqtgw\") pod \"community-operators-dn7mz\" (UID: \"9454e008-863b-47f7-8b39-98d7b7a128cb\") " pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:00 crc kubenswrapper[4873]: I0219 10:11:00.236514 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqtgw\" (UniqueName: \"kubernetes.io/projected/9454e008-863b-47f7-8b39-98d7b7a128cb-kube-api-access-nqtgw\") pod \"community-operators-dn7mz\" (UID: \"9454e008-863b-47f7-8b39-98d7b7a128cb\") " pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:00 crc kubenswrapper[4873]: I0219 10:11:00.236788 4873 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9454e008-863b-47f7-8b39-98d7b7a128cb-utilities\") pod \"community-operators-dn7mz\" (UID: \"9454e008-863b-47f7-8b39-98d7b7a128cb\") " pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:00 crc kubenswrapper[4873]: I0219 10:11:00.236821 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9454e008-863b-47f7-8b39-98d7b7a128cb-catalog-content\") pod \"community-operators-dn7mz\" (UID: \"9454e008-863b-47f7-8b39-98d7b7a128cb\") " pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:00 crc kubenswrapper[4873]: I0219 10:11:00.237616 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9454e008-863b-47f7-8b39-98d7b7a128cb-catalog-content\") pod \"community-operators-dn7mz\" (UID: \"9454e008-863b-47f7-8b39-98d7b7a128cb\") " pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:00 crc kubenswrapper[4873]: I0219 10:11:00.237673 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9454e008-863b-47f7-8b39-98d7b7a128cb-utilities\") pod \"community-operators-dn7mz\" (UID: \"9454e008-863b-47f7-8b39-98d7b7a128cb\") " pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:00 crc kubenswrapper[4873]: I0219 10:11:00.262507 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqtgw\" (UniqueName: \"kubernetes.io/projected/9454e008-863b-47f7-8b39-98d7b7a128cb-kube-api-access-nqtgw\") pod \"community-operators-dn7mz\" (UID: \"9454e008-863b-47f7-8b39-98d7b7a128cb\") " pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:00 crc kubenswrapper[4873]: I0219 10:11:00.434878 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:01 crc kubenswrapper[4873]: I0219 10:11:01.050778 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dn7mz"] Feb 19 10:11:01 crc kubenswrapper[4873]: I0219 10:11:01.509535 4873 generic.go:334] "Generic (PLEG): container finished" podID="9454e008-863b-47f7-8b39-98d7b7a128cb" containerID="d9588b3b223ddbc522f8374f58e421f3129c9acc0d3e1705f27d808d6e5c4c4a" exitCode=0 Feb 19 10:11:01 crc kubenswrapper[4873]: I0219 10:11:01.509635 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn7mz" event={"ID":"9454e008-863b-47f7-8b39-98d7b7a128cb","Type":"ContainerDied","Data":"d9588b3b223ddbc522f8374f58e421f3129c9acc0d3e1705f27d808d6e5c4c4a"} Feb 19 10:11:01 crc kubenswrapper[4873]: I0219 10:11:01.509880 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn7mz" event={"ID":"9454e008-863b-47f7-8b39-98d7b7a128cb","Type":"ContainerStarted","Data":"6016678eb243c8a893a20317cd6ef0f121cfa445fec2561420da8503f1d94afe"} Feb 19 10:11:02 crc kubenswrapper[4873]: I0219 10:11:02.522122 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn7mz" event={"ID":"9454e008-863b-47f7-8b39-98d7b7a128cb","Type":"ContainerStarted","Data":"73015bd366aa86c3d47a4a486cb2b9e5d213b08253942a273250f467cf812119"} Feb 19 10:11:03 crc kubenswrapper[4873]: I0219 10:11:03.532346 4873 generic.go:334] "Generic (PLEG): container finished" podID="9454e008-863b-47f7-8b39-98d7b7a128cb" containerID="73015bd366aa86c3d47a4a486cb2b9e5d213b08253942a273250f467cf812119" exitCode=0 Feb 19 10:11:03 crc kubenswrapper[4873]: I0219 10:11:03.532408 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn7mz" 
event={"ID":"9454e008-863b-47f7-8b39-98d7b7a128cb","Type":"ContainerDied","Data":"73015bd366aa86c3d47a4a486cb2b9e5d213b08253942a273250f467cf812119"} Feb 19 10:11:04 crc kubenswrapper[4873]: I0219 10:11:04.546496 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn7mz" event={"ID":"9454e008-863b-47f7-8b39-98d7b7a128cb","Type":"ContainerStarted","Data":"e52180c8384e35a2f5bf2a0b8136587b5856388dc38cf5f3c931307a85591480"} Feb 19 10:11:04 crc kubenswrapper[4873]: I0219 10:11:04.570157 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dn7mz" podStartSLOduration=2.089625965 podStartE2EDuration="4.570137303s" podCreationTimestamp="2026-02-19 10:11:00 +0000 UTC" firstStartedPulling="2026-02-19 10:11:01.511137255 +0000 UTC m=+1570.800568893" lastFinishedPulling="2026-02-19 10:11:03.991648593 +0000 UTC m=+1573.281080231" observedRunningTime="2026-02-19 10:11:04.56921074 +0000 UTC m=+1573.858642378" watchObservedRunningTime="2026-02-19 10:11:04.570137303 +0000 UTC m=+1573.859568931" Feb 19 10:11:10 crc kubenswrapper[4873]: I0219 10:11:10.435723 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:10 crc kubenswrapper[4873]: I0219 10:11:10.437431 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:10 crc kubenswrapper[4873]: I0219 10:11:10.481376 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:10 crc kubenswrapper[4873]: I0219 10:11:10.680531 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:10 crc kubenswrapper[4873]: I0219 10:11:10.733014 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-dn7mz"] Feb 19 10:11:12 crc kubenswrapper[4873]: I0219 10:11:12.632893 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dn7mz" podUID="9454e008-863b-47f7-8b39-98d7b7a128cb" containerName="registry-server" containerID="cri-o://e52180c8384e35a2f5bf2a0b8136587b5856388dc38cf5f3c931307a85591480" gracePeriod=2 Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.301252 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.422881 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqtgw\" (UniqueName: \"kubernetes.io/projected/9454e008-863b-47f7-8b39-98d7b7a128cb-kube-api-access-nqtgw\") pod \"9454e008-863b-47f7-8b39-98d7b7a128cb\" (UID: \"9454e008-863b-47f7-8b39-98d7b7a128cb\") " Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.423182 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9454e008-863b-47f7-8b39-98d7b7a128cb-catalog-content\") pod \"9454e008-863b-47f7-8b39-98d7b7a128cb\" (UID: \"9454e008-863b-47f7-8b39-98d7b7a128cb\") " Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.423314 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9454e008-863b-47f7-8b39-98d7b7a128cb-utilities\") pod \"9454e008-863b-47f7-8b39-98d7b7a128cb\" (UID: \"9454e008-863b-47f7-8b39-98d7b7a128cb\") " Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.424613 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9454e008-863b-47f7-8b39-98d7b7a128cb-utilities" (OuterVolumeSpecName: "utilities") pod "9454e008-863b-47f7-8b39-98d7b7a128cb" (UID: 
"9454e008-863b-47f7-8b39-98d7b7a128cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.429632 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9454e008-863b-47f7-8b39-98d7b7a128cb-kube-api-access-nqtgw" (OuterVolumeSpecName: "kube-api-access-nqtgw") pod "9454e008-863b-47f7-8b39-98d7b7a128cb" (UID: "9454e008-863b-47f7-8b39-98d7b7a128cb"). InnerVolumeSpecName "kube-api-access-nqtgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.479052 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9454e008-863b-47f7-8b39-98d7b7a128cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9454e008-863b-47f7-8b39-98d7b7a128cb" (UID: "9454e008-863b-47f7-8b39-98d7b7a128cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.484284 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" Feb 19 10:11:13 crc kubenswrapper[4873]: E0219 10:11:13.484637 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.532047 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqtgw\" (UniqueName: \"kubernetes.io/projected/9454e008-863b-47f7-8b39-98d7b7a128cb-kube-api-access-nqtgw\") on node \"crc\" DevicePath \"\"" Feb 19 
10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.532286 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9454e008-863b-47f7-8b39-98d7b7a128cb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.532325 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9454e008-863b-47f7-8b39-98d7b7a128cb-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.644502 4873 generic.go:334] "Generic (PLEG): container finished" podID="9454e008-863b-47f7-8b39-98d7b7a128cb" containerID="e52180c8384e35a2f5bf2a0b8136587b5856388dc38cf5f3c931307a85591480" exitCode=0 Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.644593 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dn7mz" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.644611 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn7mz" event={"ID":"9454e008-863b-47f7-8b39-98d7b7a128cb","Type":"ContainerDied","Data":"e52180c8384e35a2f5bf2a0b8136587b5856388dc38cf5f3c931307a85591480"} Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.645570 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dn7mz" event={"ID":"9454e008-863b-47f7-8b39-98d7b7a128cb","Type":"ContainerDied","Data":"6016678eb243c8a893a20317cd6ef0f121cfa445fec2561420da8503f1d94afe"} Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.645615 4873 scope.go:117] "RemoveContainer" containerID="e52180c8384e35a2f5bf2a0b8136587b5856388dc38cf5f3c931307a85591480" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.668727 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dn7mz"] Feb 19 10:11:13 crc 
kubenswrapper[4873]: I0219 10:11:13.671977 4873 scope.go:117] "RemoveContainer" containerID="73015bd366aa86c3d47a4a486cb2b9e5d213b08253942a273250f467cf812119" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.680196 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dn7mz"] Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.701514 4873 scope.go:117] "RemoveContainer" containerID="d9588b3b223ddbc522f8374f58e421f3129c9acc0d3e1705f27d808d6e5c4c4a" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.735984 4873 scope.go:117] "RemoveContainer" containerID="e52180c8384e35a2f5bf2a0b8136587b5856388dc38cf5f3c931307a85591480" Feb 19 10:11:13 crc kubenswrapper[4873]: E0219 10:11:13.736445 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e52180c8384e35a2f5bf2a0b8136587b5856388dc38cf5f3c931307a85591480\": container with ID starting with e52180c8384e35a2f5bf2a0b8136587b5856388dc38cf5f3c931307a85591480 not found: ID does not exist" containerID="e52180c8384e35a2f5bf2a0b8136587b5856388dc38cf5f3c931307a85591480" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.736476 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e52180c8384e35a2f5bf2a0b8136587b5856388dc38cf5f3c931307a85591480"} err="failed to get container status \"e52180c8384e35a2f5bf2a0b8136587b5856388dc38cf5f3c931307a85591480\": rpc error: code = NotFound desc = could not find container \"e52180c8384e35a2f5bf2a0b8136587b5856388dc38cf5f3c931307a85591480\": container with ID starting with e52180c8384e35a2f5bf2a0b8136587b5856388dc38cf5f3c931307a85591480 not found: ID does not exist" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.736498 4873 scope.go:117] "RemoveContainer" containerID="73015bd366aa86c3d47a4a486cb2b9e5d213b08253942a273250f467cf812119" Feb 19 10:11:13 crc kubenswrapper[4873]: E0219 10:11:13.736718 4873 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73015bd366aa86c3d47a4a486cb2b9e5d213b08253942a273250f467cf812119\": container with ID starting with 73015bd366aa86c3d47a4a486cb2b9e5d213b08253942a273250f467cf812119 not found: ID does not exist" containerID="73015bd366aa86c3d47a4a486cb2b9e5d213b08253942a273250f467cf812119" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.736752 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73015bd366aa86c3d47a4a486cb2b9e5d213b08253942a273250f467cf812119"} err="failed to get container status \"73015bd366aa86c3d47a4a486cb2b9e5d213b08253942a273250f467cf812119\": rpc error: code = NotFound desc = could not find container \"73015bd366aa86c3d47a4a486cb2b9e5d213b08253942a273250f467cf812119\": container with ID starting with 73015bd366aa86c3d47a4a486cb2b9e5d213b08253942a273250f467cf812119 not found: ID does not exist" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.736771 4873 scope.go:117] "RemoveContainer" containerID="d9588b3b223ddbc522f8374f58e421f3129c9acc0d3e1705f27d808d6e5c4c4a" Feb 19 10:11:13 crc kubenswrapper[4873]: E0219 10:11:13.737057 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9588b3b223ddbc522f8374f58e421f3129c9acc0d3e1705f27d808d6e5c4c4a\": container with ID starting with d9588b3b223ddbc522f8374f58e421f3129c9acc0d3e1705f27d808d6e5c4c4a not found: ID does not exist" containerID="d9588b3b223ddbc522f8374f58e421f3129c9acc0d3e1705f27d808d6e5c4c4a" Feb 19 10:11:13 crc kubenswrapper[4873]: I0219 10:11:13.737075 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9588b3b223ddbc522f8374f58e421f3129c9acc0d3e1705f27d808d6e5c4c4a"} err="failed to get container status \"d9588b3b223ddbc522f8374f58e421f3129c9acc0d3e1705f27d808d6e5c4c4a\": rpc error: code = NotFound 
desc = could not find container \"d9588b3b223ddbc522f8374f58e421f3129c9acc0d3e1705f27d808d6e5c4c4a\": container with ID starting with d9588b3b223ddbc522f8374f58e421f3129c9acc0d3e1705f27d808d6e5c4c4a not found: ID does not exist" Feb 19 10:11:15 crc kubenswrapper[4873]: I0219 10:11:15.498119 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9454e008-863b-47f7-8b39-98d7b7a128cb" path="/var/lib/kubelet/pods/9454e008-863b-47f7-8b39-98d7b7a128cb/volumes" Feb 19 10:11:27 crc kubenswrapper[4873]: I0219 10:11:27.484678 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" Feb 19 10:11:27 crc kubenswrapper[4873]: E0219 10:11:27.485552 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.049917 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-p55tt"] Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.061363 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-p55tt"] Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.073421 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c73a-account-create-update-zxxrn"] Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.082634 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c73a-account-create-update-zxxrn"] Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.145166 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7nxdn"] Feb 19 
10:11:28 crc kubenswrapper[4873]: E0219 10:11:28.146145 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9454e008-863b-47f7-8b39-98d7b7a128cb" containerName="extract-utilities" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.146162 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9454e008-863b-47f7-8b39-98d7b7a128cb" containerName="extract-utilities" Feb 19 10:11:28 crc kubenswrapper[4873]: E0219 10:11:28.146200 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9454e008-863b-47f7-8b39-98d7b7a128cb" containerName="registry-server" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.146209 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9454e008-863b-47f7-8b39-98d7b7a128cb" containerName="registry-server" Feb 19 10:11:28 crc kubenswrapper[4873]: E0219 10:11:28.146245 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9454e008-863b-47f7-8b39-98d7b7a128cb" containerName="extract-content" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.146253 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9454e008-863b-47f7-8b39-98d7b7a128cb" containerName="extract-content" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.146499 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="9454e008-863b-47f7-8b39-98d7b7a128cb" containerName="registry-server" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.148373 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.173053 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7nxdn"] Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.232586 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4zjr\" (UniqueName: \"kubernetes.io/projected/64a48b6f-c0a6-4566-82a1-649e91bcd486-kube-api-access-d4zjr\") pod \"redhat-marketplace-7nxdn\" (UID: \"64a48b6f-c0a6-4566-82a1-649e91bcd486\") " pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.232908 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64a48b6f-c0a6-4566-82a1-649e91bcd486-utilities\") pod \"redhat-marketplace-7nxdn\" (UID: \"64a48b6f-c0a6-4566-82a1-649e91bcd486\") " pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.233487 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64a48b6f-c0a6-4566-82a1-649e91bcd486-catalog-content\") pod \"redhat-marketplace-7nxdn\" (UID: \"64a48b6f-c0a6-4566-82a1-649e91bcd486\") " pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.335897 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64a48b6f-c0a6-4566-82a1-649e91bcd486-catalog-content\") pod \"redhat-marketplace-7nxdn\" (UID: \"64a48b6f-c0a6-4566-82a1-649e91bcd486\") " pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.335999 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d4zjr\" (UniqueName: \"kubernetes.io/projected/64a48b6f-c0a6-4566-82a1-649e91bcd486-kube-api-access-d4zjr\") pod \"redhat-marketplace-7nxdn\" (UID: \"64a48b6f-c0a6-4566-82a1-649e91bcd486\") " pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.336498 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64a48b6f-c0a6-4566-82a1-649e91bcd486-catalog-content\") pod \"redhat-marketplace-7nxdn\" (UID: \"64a48b6f-c0a6-4566-82a1-649e91bcd486\") " pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.336563 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64a48b6f-c0a6-4566-82a1-649e91bcd486-utilities\") pod \"redhat-marketplace-7nxdn\" (UID: \"64a48b6f-c0a6-4566-82a1-649e91bcd486\") " pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.336945 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64a48b6f-c0a6-4566-82a1-649e91bcd486-utilities\") pod \"redhat-marketplace-7nxdn\" (UID: \"64a48b6f-c0a6-4566-82a1-649e91bcd486\") " pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.365234 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4zjr\" (UniqueName: \"kubernetes.io/projected/64a48b6f-c0a6-4566-82a1-649e91bcd486-kube-api-access-d4zjr\") pod \"redhat-marketplace-7nxdn\" (UID: \"64a48b6f-c0a6-4566-82a1-649e91bcd486\") " pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.485386 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:28 crc kubenswrapper[4873]: I0219 10:11:28.969172 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7nxdn"] Feb 19 10:11:29 crc kubenswrapper[4873]: I0219 10:11:29.497603 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d81f72af-8420-4334-811e-f0e0cc1c7731" path="/var/lib/kubelet/pods/d81f72af-8420-4334-811e-f0e0cc1c7731/volumes" Feb 19 10:11:29 crc kubenswrapper[4873]: I0219 10:11:29.498417 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88" path="/var/lib/kubelet/pods/fd0e3e74-f1aa-4b5f-a2ae-b89f90644f88/volumes" Feb 19 10:11:29 crc kubenswrapper[4873]: I0219 10:11:29.793020 4873 generic.go:334] "Generic (PLEG): container finished" podID="64a48b6f-c0a6-4566-82a1-649e91bcd486" containerID="d1ee2b33d1585962e5c2d8f8deb1f53ccd7bfb877b0017578ca6bff8f7dfd26e" exitCode=0 Feb 19 10:11:29 crc kubenswrapper[4873]: I0219 10:11:29.793076 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7nxdn" event={"ID":"64a48b6f-c0a6-4566-82a1-649e91bcd486","Type":"ContainerDied","Data":"d1ee2b33d1585962e5c2d8f8deb1f53ccd7bfb877b0017578ca6bff8f7dfd26e"} Feb 19 10:11:29 crc kubenswrapper[4873]: I0219 10:11:29.793123 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7nxdn" event={"ID":"64a48b6f-c0a6-4566-82a1-649e91bcd486","Type":"ContainerStarted","Data":"f6db4c0d2d18a7e3a75dd04979a22304b269c40be6cddc6d1fda3629593b15c0"} Feb 19 10:11:31 crc kubenswrapper[4873]: I0219 10:11:31.817410 4873 generic.go:334] "Generic (PLEG): container finished" podID="64a48b6f-c0a6-4566-82a1-649e91bcd486" containerID="9d6de769e9d17333501a00980ac56829127a539f53f95bb25cf420c5630db360" exitCode=0 Feb 19 10:11:31 crc kubenswrapper[4873]: I0219 10:11:31.817523 4873 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7nxdn" event={"ID":"64a48b6f-c0a6-4566-82a1-649e91bcd486","Type":"ContainerDied","Data":"9d6de769e9d17333501a00980ac56829127a539f53f95bb25cf420c5630db360"} Feb 19 10:11:33 crc kubenswrapper[4873]: I0219 10:11:33.035070 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-46kds"] Feb 19 10:11:33 crc kubenswrapper[4873]: I0219 10:11:33.045614 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-46kds"] Feb 19 10:11:33 crc kubenswrapper[4873]: I0219 10:11:33.056781 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-r4fbt"] Feb 19 10:11:33 crc kubenswrapper[4873]: I0219 10:11:33.070561 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-e187-account-create-update-4xb7l"] Feb 19 10:11:33 crc kubenswrapper[4873]: I0219 10:11:33.082142 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-r4fbt"] Feb 19 10:11:33 crc kubenswrapper[4873]: I0219 10:11:33.094213 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-f064-account-create-update-flh2f"] Feb 19 10:11:33 crc kubenswrapper[4873]: I0219 10:11:33.105614 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-e187-account-create-update-4xb7l"] Feb 19 10:11:33 crc kubenswrapper[4873]: I0219 10:11:33.116361 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-f064-account-create-update-flh2f"] Feb 19 10:11:33 crc kubenswrapper[4873]: I0219 10:11:33.521125 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="179cf76d-a15d-4bce-be42-18ad2e4abb94" path="/var/lib/kubelet/pods/179cf76d-a15d-4bce-be42-18ad2e4abb94/volumes" Feb 19 10:11:33 crc kubenswrapper[4873]: I0219 10:11:33.806340 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6584bab0-12c6-4bce-99be-d38f3748f896" path="/var/lib/kubelet/pods/6584bab0-12c6-4bce-99be-d38f3748f896/volumes" Feb 19 10:11:33 crc kubenswrapper[4873]: I0219 10:11:33.837227 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1770246-951b-40da-a0a2-4320dde71437" path="/var/lib/kubelet/pods/a1770246-951b-40da-a0a2-4320dde71437/volumes" Feb 19 10:11:33 crc kubenswrapper[4873]: I0219 10:11:33.844918 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af085fbb-9aaa-4d01-8a0f-a061acf3a845" path="/var/lib/kubelet/pods/af085fbb-9aaa-4d01-8a0f-a061acf3a845/volumes" Feb 19 10:11:34 crc kubenswrapper[4873]: I0219 10:11:34.033568 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-087c-account-create-update-qnlsx"] Feb 19 10:11:34 crc kubenswrapper[4873]: I0219 10:11:34.046318 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-dftzh"] Feb 19 10:11:34 crc kubenswrapper[4873]: I0219 10:11:34.056660 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-dftzh"] Feb 19 10:11:34 crc kubenswrapper[4873]: I0219 10:11:34.066633 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-087c-account-create-update-qnlsx"] Feb 19 10:11:34 crc kubenswrapper[4873]: I0219 10:11:34.850058 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7nxdn" event={"ID":"64a48b6f-c0a6-4566-82a1-649e91bcd486","Type":"ContainerStarted","Data":"05b98d967cad1ee13631fa999a1fe6d672e3aca8c59af9321211e972aeea3daf"} Feb 19 10:11:34 crc kubenswrapper[4873]: I0219 10:11:34.868578 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7nxdn" podStartSLOduration=2.74214507 podStartE2EDuration="6.868559756s" podCreationTimestamp="2026-02-19 10:11:28 +0000 UTC" firstStartedPulling="2026-02-19 10:11:29.795895411 +0000 UTC 
m=+1599.085327049" lastFinishedPulling="2026-02-19 10:11:33.922310097 +0000 UTC m=+1603.211741735" observedRunningTime="2026-02-19 10:11:34.868059414 +0000 UTC m=+1604.157491062" watchObservedRunningTime="2026-02-19 10:11:34.868559756 +0000 UTC m=+1604.157991394" Feb 19 10:11:35 crc kubenswrapper[4873]: I0219 10:11:35.498420 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b0cc2ef-89a2-4220-8b44-7fc71537ab50" path="/var/lib/kubelet/pods/7b0cc2ef-89a2-4220-8b44-7fc71537ab50/volumes" Feb 19 10:11:35 crc kubenswrapper[4873]: I0219 10:11:35.499948 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4" path="/var/lib/kubelet/pods/bd8b7b2e-f4a8-4af9-99aa-a1e8c3d78bd4/volumes" Feb 19 10:11:38 crc kubenswrapper[4873]: I0219 10:11:38.484328 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" Feb 19 10:11:38 crc kubenswrapper[4873]: E0219 10:11:38.484848 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:11:38 crc kubenswrapper[4873]: I0219 10:11:38.485658 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:38 crc kubenswrapper[4873]: I0219 10:11:38.485801 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:38 crc kubenswrapper[4873]: I0219 10:11:38.544248 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:39 crc kubenswrapper[4873]: I0219 10:11:39.968341 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:40 crc kubenswrapper[4873]: I0219 10:11:40.060726 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-kvmj2"] Feb 19 10:11:40 crc kubenswrapper[4873]: I0219 10:11:40.073515 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-kvmj2"] Feb 19 10:11:40 crc kubenswrapper[4873]: I0219 10:11:40.086320 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7nxdn"] Feb 19 10:11:41 crc kubenswrapper[4873]: I0219 10:11:41.516117 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a6680f-7e8e-4326-9401-fde957599477" path="/var/lib/kubelet/pods/29a6680f-7e8e-4326-9401-fde957599477/volumes" Feb 19 10:11:41 crc kubenswrapper[4873]: I0219 10:11:41.917413 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7nxdn" podUID="64a48b6f-c0a6-4566-82a1-649e91bcd486" containerName="registry-server" containerID="cri-o://05b98d967cad1ee13631fa999a1fe6d672e3aca8c59af9321211e972aeea3daf" gracePeriod=2 Feb 19 10:11:42 crc kubenswrapper[4873]: I0219 10:11:42.930068 4873 generic.go:334] "Generic (PLEG): container finished" podID="64a48b6f-c0a6-4566-82a1-649e91bcd486" containerID="05b98d967cad1ee13631fa999a1fe6d672e3aca8c59af9321211e972aeea3daf" exitCode=0 Feb 19 10:11:42 crc kubenswrapper[4873]: I0219 10:11:42.930153 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7nxdn" event={"ID":"64a48b6f-c0a6-4566-82a1-649e91bcd486","Type":"ContainerDied","Data":"05b98d967cad1ee13631fa999a1fe6d672e3aca8c59af9321211e972aeea3daf"} Feb 19 10:11:42 crc kubenswrapper[4873]: I0219 
10:11:42.930448 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7nxdn" event={"ID":"64a48b6f-c0a6-4566-82a1-649e91bcd486","Type":"ContainerDied","Data":"f6db4c0d2d18a7e3a75dd04979a22304b269c40be6cddc6d1fda3629593b15c0"} Feb 19 10:11:42 crc kubenswrapper[4873]: I0219 10:11:42.930465 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6db4c0d2d18a7e3a75dd04979a22304b269c40be6cddc6d1fda3629593b15c0" Feb 19 10:11:42 crc kubenswrapper[4873]: I0219 10:11:42.972817 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:43 crc kubenswrapper[4873]: I0219 10:11:43.082492 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64a48b6f-c0a6-4566-82a1-649e91bcd486-utilities\") pod \"64a48b6f-c0a6-4566-82a1-649e91bcd486\" (UID: \"64a48b6f-c0a6-4566-82a1-649e91bcd486\") " Feb 19 10:11:43 crc kubenswrapper[4873]: I0219 10:11:43.082717 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4zjr\" (UniqueName: \"kubernetes.io/projected/64a48b6f-c0a6-4566-82a1-649e91bcd486-kube-api-access-d4zjr\") pod \"64a48b6f-c0a6-4566-82a1-649e91bcd486\" (UID: \"64a48b6f-c0a6-4566-82a1-649e91bcd486\") " Feb 19 10:11:43 crc kubenswrapper[4873]: I0219 10:11:43.082756 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64a48b6f-c0a6-4566-82a1-649e91bcd486-catalog-content\") pod \"64a48b6f-c0a6-4566-82a1-649e91bcd486\" (UID: \"64a48b6f-c0a6-4566-82a1-649e91bcd486\") " Feb 19 10:11:43 crc kubenswrapper[4873]: I0219 10:11:43.083315 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64a48b6f-c0a6-4566-82a1-649e91bcd486-utilities" (OuterVolumeSpecName: 
"utilities") pod "64a48b6f-c0a6-4566-82a1-649e91bcd486" (UID: "64a48b6f-c0a6-4566-82a1-649e91bcd486"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:11:43 crc kubenswrapper[4873]: I0219 10:11:43.102240 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a48b6f-c0a6-4566-82a1-649e91bcd486-kube-api-access-d4zjr" (OuterVolumeSpecName: "kube-api-access-d4zjr") pod "64a48b6f-c0a6-4566-82a1-649e91bcd486" (UID: "64a48b6f-c0a6-4566-82a1-649e91bcd486"). InnerVolumeSpecName "kube-api-access-d4zjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:11:43 crc kubenswrapper[4873]: I0219 10:11:43.114047 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64a48b6f-c0a6-4566-82a1-649e91bcd486-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64a48b6f-c0a6-4566-82a1-649e91bcd486" (UID: "64a48b6f-c0a6-4566-82a1-649e91bcd486"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:11:43 crc kubenswrapper[4873]: I0219 10:11:43.185485 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4zjr\" (UniqueName: \"kubernetes.io/projected/64a48b6f-c0a6-4566-82a1-649e91bcd486-kube-api-access-d4zjr\") on node \"crc\" DevicePath \"\"" Feb 19 10:11:43 crc kubenswrapper[4873]: I0219 10:11:43.185541 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64a48b6f-c0a6-4566-82a1-649e91bcd486-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 10:11:43 crc kubenswrapper[4873]: I0219 10:11:43.185554 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64a48b6f-c0a6-4566-82a1-649e91bcd486-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 10:11:43 crc kubenswrapper[4873]: I0219 10:11:43.938740 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7nxdn" Feb 19 10:11:43 crc kubenswrapper[4873]: I0219 10:11:43.974461 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7nxdn"] Feb 19 10:11:43 crc kubenswrapper[4873]: I0219 10:11:43.984576 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7nxdn"] Feb 19 10:11:46 crc kubenswrapper[4873]: I0219 10:11:46.384079 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64a48b6f-c0a6-4566-82a1-649e91bcd486" path="/var/lib/kubelet/pods/64a48b6f-c0a6-4566-82a1-649e91bcd486/volumes" Feb 19 10:11:53 crc kubenswrapper[4873]: I0219 10:11:53.485428 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" Feb 19 10:11:53 crc kubenswrapper[4873]: E0219 10:11:53.486306 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:11:56 crc kubenswrapper[4873]: I0219 10:11:56.461319 4873 scope.go:117] "RemoveContainer" containerID="97c6c0035f5f6c9762dd68933f3909de6f99dfa1fe212cf2c55b0644dfffdb93" Feb 19 10:11:56 crc kubenswrapper[4873]: I0219 10:11:56.499936 4873 scope.go:117] "RemoveContainer" containerID="d63b34383441b0e539673e31cf4ea017f3d4fcdbd72ad26d47bf96c33fcf565d" Feb 19 10:11:56 crc kubenswrapper[4873]: I0219 10:11:56.561064 4873 scope.go:117] "RemoveContainer" containerID="7d6d1faa851ee46aca753c0c6509416782269dc982725ceddb2cd7f19fc16f13" Feb 19 10:11:56 crc kubenswrapper[4873]: I0219 10:11:56.640389 4873 scope.go:117] "RemoveContainer" containerID="6a370733fa679d2517624889ae788a6c37c512bf2894dbe6a54f6e24bdad6056" Feb 19 10:11:56 crc kubenswrapper[4873]: I0219 10:11:56.682259 4873 scope.go:117] "RemoveContainer" containerID="5b7d076de566fc4d5772c9116560231f2acaebd0aad62281f0cb88965b142cc2" Feb 19 10:11:56 crc kubenswrapper[4873]: I0219 10:11:56.724426 4873 scope.go:117] "RemoveContainer" containerID="f0d3f3ad8d69a092fbacd08190bbe079ce8644eec25f2003bbee9cc3d511dd9c" Feb 19 10:11:56 crc kubenswrapper[4873]: I0219 10:11:56.779911 4873 scope.go:117] "RemoveContainer" containerID="3a7ee324cc97736a2be2ff10cda880e991b9ebce5c06108335c9156379f7a8ea" Feb 19 10:11:56 crc kubenswrapper[4873]: I0219 10:11:56.816006 4873 scope.go:117] "RemoveContainer" containerID="7cdae30c7bd3b7746068a98e9abd29f90395d96b144f2365d0ebb1da465756e0" Feb 19 10:11:56 crc kubenswrapper[4873]: I0219 10:11:56.842582 4873 scope.go:117] "RemoveContainer" containerID="2be1eaacedf333b387e3ffd6dce5223b73f9487c48808cd68df4b60a3f55fd39" Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 
10:12:01.637401 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z6dkm"] Feb 19 10:12:01 crc kubenswrapper[4873]: E0219 10:12:01.638809 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a48b6f-c0a6-4566-82a1-649e91bcd486" containerName="extract-content" Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.638827 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a48b6f-c0a6-4566-82a1-649e91bcd486" containerName="extract-content" Feb 19 10:12:01 crc kubenswrapper[4873]: E0219 10:12:01.638842 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a48b6f-c0a6-4566-82a1-649e91bcd486" containerName="extract-utilities" Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.638849 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a48b6f-c0a6-4566-82a1-649e91bcd486" containerName="extract-utilities" Feb 19 10:12:01 crc kubenswrapper[4873]: E0219 10:12:01.638865 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a48b6f-c0a6-4566-82a1-649e91bcd486" containerName="registry-server" Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.638872 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a48b6f-c0a6-4566-82a1-649e91bcd486" containerName="registry-server" Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.639127 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a48b6f-c0a6-4566-82a1-649e91bcd486" containerName="registry-server" Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.641138 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z6dkm" Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.671294 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z6dkm"] Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.800388 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d83cc040-5619-46ef-9e78-b7b1f1117e79-utilities\") pod \"redhat-operators-z6dkm\" (UID: \"d83cc040-5619-46ef-9e78-b7b1f1117e79\") " pod="openshift-marketplace/redhat-operators-z6dkm" Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.800562 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d83cc040-5619-46ef-9e78-b7b1f1117e79-catalog-content\") pod \"redhat-operators-z6dkm\" (UID: \"d83cc040-5619-46ef-9e78-b7b1f1117e79\") " pod="openshift-marketplace/redhat-operators-z6dkm" Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.800681 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g88c8\" (UniqueName: \"kubernetes.io/projected/d83cc040-5619-46ef-9e78-b7b1f1117e79-kube-api-access-g88c8\") pod \"redhat-operators-z6dkm\" (UID: \"d83cc040-5619-46ef-9e78-b7b1f1117e79\") " pod="openshift-marketplace/redhat-operators-z6dkm" Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.902620 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g88c8\" (UniqueName: \"kubernetes.io/projected/d83cc040-5619-46ef-9e78-b7b1f1117e79-kube-api-access-g88c8\") pod \"redhat-operators-z6dkm\" (UID: \"d83cc040-5619-46ef-9e78-b7b1f1117e79\") " pod="openshift-marketplace/redhat-operators-z6dkm" Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.902694 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d83cc040-5619-46ef-9e78-b7b1f1117e79-utilities\") pod \"redhat-operators-z6dkm\" (UID: \"d83cc040-5619-46ef-9e78-b7b1f1117e79\") " pod="openshift-marketplace/redhat-operators-z6dkm"
Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.902810 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d83cc040-5619-46ef-9e78-b7b1f1117e79-catalog-content\") pod \"redhat-operators-z6dkm\" (UID: \"d83cc040-5619-46ef-9e78-b7b1f1117e79\") " pod="openshift-marketplace/redhat-operators-z6dkm"
Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.903375 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d83cc040-5619-46ef-9e78-b7b1f1117e79-utilities\") pod \"redhat-operators-z6dkm\" (UID: \"d83cc040-5619-46ef-9e78-b7b1f1117e79\") " pod="openshift-marketplace/redhat-operators-z6dkm"
Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.903410 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d83cc040-5619-46ef-9e78-b7b1f1117e79-catalog-content\") pod \"redhat-operators-z6dkm\" (UID: \"d83cc040-5619-46ef-9e78-b7b1f1117e79\") " pod="openshift-marketplace/redhat-operators-z6dkm"
Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.926675 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g88c8\" (UniqueName: \"kubernetes.io/projected/d83cc040-5619-46ef-9e78-b7b1f1117e79-kube-api-access-g88c8\") pod \"redhat-operators-z6dkm\" (UID: \"d83cc040-5619-46ef-9e78-b7b1f1117e79\") " pod="openshift-marketplace/redhat-operators-z6dkm"
Feb 19 10:12:01 crc kubenswrapper[4873]: I0219 10:12:01.977492 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6dkm"
Feb 19 10:12:02 crc kubenswrapper[4873]: I0219 10:12:02.090182 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-w65h5"]
Feb 19 10:12:02 crc kubenswrapper[4873]: I0219 10:12:02.106183 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-f5jnw"]
Feb 19 10:12:02 crc kubenswrapper[4873]: I0219 10:12:02.119362 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-73d0-account-create-update-vwj8q"]
Feb 19 10:12:02 crc kubenswrapper[4873]: I0219 10:12:02.154786 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-w65h5"]
Feb 19 10:12:02 crc kubenswrapper[4873]: I0219 10:12:02.166769 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-45f0-account-create-update-b4rvj"]
Feb 19 10:12:02 crc kubenswrapper[4873]: I0219 10:12:02.207387 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-63c5-account-create-update-rqftk"]
Feb 19 10:12:02 crc kubenswrapper[4873]: I0219 10:12:02.262431 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-nfk5h"]
Feb 19 10:12:02 crc kubenswrapper[4873]: I0219 10:12:02.283738 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-63c5-account-create-update-rqftk"]
Feb 19 10:12:02 crc kubenswrapper[4873]: I0219 10:12:02.307237 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-nfk5h"]
Feb 19 10:12:02 crc kubenswrapper[4873]: I0219 10:12:02.321266 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-45f0-account-create-update-b4rvj"]
Feb 19 10:12:02 crc kubenswrapper[4873]: I0219 10:12:02.338924 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-f5jnw"]
Feb 19 10:12:02 crc kubenswrapper[4873]: I0219 10:12:02.351514 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-73d0-account-create-update-vwj8q"]
Feb 19 10:12:02 crc kubenswrapper[4873]: I0219 10:12:02.574532 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z6dkm"]
Feb 19 10:12:03 crc kubenswrapper[4873]: I0219 10:12:03.200168 4873 generic.go:334] "Generic (PLEG): container finished" podID="d83cc040-5619-46ef-9e78-b7b1f1117e79" containerID="5b4b0b2b44266488a6c72bad150fbb6843f17c14bdc929c263ca20c5920041bd" exitCode=0
Feb 19 10:12:03 crc kubenswrapper[4873]: I0219 10:12:03.200563 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6dkm" event={"ID":"d83cc040-5619-46ef-9e78-b7b1f1117e79","Type":"ContainerDied","Data":"5b4b0b2b44266488a6c72bad150fbb6843f17c14bdc929c263ca20c5920041bd"}
Feb 19 10:12:03 crc kubenswrapper[4873]: I0219 10:12:03.200602 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6dkm" event={"ID":"d83cc040-5619-46ef-9e78-b7b1f1117e79","Type":"ContainerStarted","Data":"cc423b5accdb7784192d7c3d495f5ef39826afe8978f935d56b7fa1b762d6316"}
Feb 19 10:12:03 crc kubenswrapper[4873]: I0219 10:12:03.496474 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="679f69ef-9960-4e33-a6aa-09baefabc417" path="/var/lib/kubelet/pods/679f69ef-9960-4e33-a6aa-09baefabc417/volumes"
Feb 19 10:12:03 crc kubenswrapper[4873]: I0219 10:12:03.498442 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94c35b26-7dc1-4cea-bbe7-53a9e47df7ba" path="/var/lib/kubelet/pods/94c35b26-7dc1-4cea-bbe7-53a9e47df7ba/volumes"
Feb 19 10:12:03 crc kubenswrapper[4873]: I0219 10:12:03.500399 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf0daf0d-c150-49de-98af-3f65dd78112f" path="/var/lib/kubelet/pods/bf0daf0d-c150-49de-98af-3f65dd78112f/volumes"
Feb 19 10:12:03 crc kubenswrapper[4873]: I0219 10:12:03.500965 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1f97f25-d006-40d7-a090-ab45ab11b282" path="/var/lib/kubelet/pods/e1f97f25-d006-40d7-a090-ab45ab11b282/volumes"
Feb 19 10:12:03 crc kubenswrapper[4873]: I0219 10:12:03.504605 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec120760-bb10-44ff-bbb0-ed1665b4e17b" path="/var/lib/kubelet/pods/ec120760-bb10-44ff-bbb0-ed1665b4e17b/volumes"
Feb 19 10:12:03 crc kubenswrapper[4873]: I0219 10:12:03.506456 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd7769ae-caf0-4f62-be96-90d6fa334259" path="/var/lib/kubelet/pods/fd7769ae-caf0-4f62-be96-90d6fa334259/volumes"
Feb 19 10:12:04 crc kubenswrapper[4873]: I0219 10:12:04.484457 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790"
Feb 19 10:12:04 crc kubenswrapper[4873]: E0219 10:12:04.485229 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:12:05 crc kubenswrapper[4873]: I0219 10:12:05.226308 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6dkm" event={"ID":"d83cc040-5619-46ef-9e78-b7b1f1117e79","Type":"ContainerStarted","Data":"e89706f794f3858742735b50534c9d6b0f3f672b1c1c6818cde4477bbe445d82"}
Feb 19 10:12:08 crc kubenswrapper[4873]: I0219 10:12:08.260877 4873 generic.go:334] "Generic (PLEG): container finished" podID="d83cc040-5619-46ef-9e78-b7b1f1117e79" containerID="e89706f794f3858742735b50534c9d6b0f3f672b1c1c6818cde4477bbe445d82" exitCode=0
Feb 19 10:12:08 crc kubenswrapper[4873]: I0219 10:12:08.260947 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6dkm" event={"ID":"d83cc040-5619-46ef-9e78-b7b1f1117e79","Type":"ContainerDied","Data":"e89706f794f3858742735b50534c9d6b0f3f672b1c1c6818cde4477bbe445d82"}
Feb 19 10:12:09 crc kubenswrapper[4873]: I0219 10:12:09.036979 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-k6j2h"]
Feb 19 10:12:09 crc kubenswrapper[4873]: I0219 10:12:09.048476 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-k6j2h"]
Feb 19 10:12:09 crc kubenswrapper[4873]: I0219 10:12:09.496397 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a075072a-1153-4963-91c7-e9e2aa08f988" path="/var/lib/kubelet/pods/a075072a-1153-4963-91c7-e9e2aa08f988/volumes"
Feb 19 10:12:10 crc kubenswrapper[4873]: I0219 10:12:10.028621 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-86n9s"]
Feb 19 10:12:10 crc kubenswrapper[4873]: I0219 10:12:10.037157 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-86n9s"]
Feb 19 10:12:10 crc kubenswrapper[4873]: I0219 10:12:10.283239 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6dkm" event={"ID":"d83cc040-5619-46ef-9e78-b7b1f1117e79","Type":"ContainerStarted","Data":"c09685492dfc4d5d29f71ac9d4f3a6d2840be369e5f6cf0d017ea695c5bf24b4"}
Feb 19 10:12:10 crc kubenswrapper[4873]: I0219 10:12:10.305812 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z6dkm" podStartSLOduration=3.039666542 podStartE2EDuration="9.305789728s" podCreationTimestamp="2026-02-19 10:12:01 +0000 UTC" firstStartedPulling="2026-02-19 10:12:03.202526993 +0000 UTC m=+1632.491958631" lastFinishedPulling="2026-02-19 10:12:09.468650179 +0000 UTC m=+1638.758081817" observedRunningTime="2026-02-19 10:12:10.302617528 +0000 UTC m=+1639.592049186" watchObservedRunningTime="2026-02-19 10:12:10.305789728 +0000 UTC m=+1639.595221366"
Feb 19 10:12:11 crc kubenswrapper[4873]: I0219 10:12:11.496308 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="735c003d-082d-431f-9906-20c8946f1bf4" path="/var/lib/kubelet/pods/735c003d-082d-431f-9906-20c8946f1bf4/volumes"
Feb 19 10:12:11 crc kubenswrapper[4873]: I0219 10:12:11.978394 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z6dkm"
Feb 19 10:12:11 crc kubenswrapper[4873]: I0219 10:12:11.978456 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z6dkm"
Feb 19 10:12:13 crc kubenswrapper[4873]: I0219 10:12:13.025795 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z6dkm" podUID="d83cc040-5619-46ef-9e78-b7b1f1117e79" containerName="registry-server" probeResult="failure" output=<
Feb 19 10:12:13 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s
Feb 19 10:12:13 crc kubenswrapper[4873]: >
Feb 19 10:12:14 crc kubenswrapper[4873]: I0219 10:12:14.321457 4873 generic.go:334] "Generic (PLEG): container finished" podID="ab7d5a49-ac61-4963-8766-1716098f3d4c" containerID="5a71dd5b4760261dd2be0e41411c70e6162e350c368db3bf6d1ef5a664d01e28" exitCode=0
Feb 19 10:12:14 crc kubenswrapper[4873]: I0219 10:12:14.321543 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" event={"ID":"ab7d5a49-ac61-4963-8766-1716098f3d4c","Type":"ContainerDied","Data":"5a71dd5b4760261dd2be0e41411c70e6162e350c368db3bf6d1ef5a664d01e28"}
Feb 19 10:12:15 crc kubenswrapper[4873]: I0219 10:12:15.752733 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj"
Feb 19 10:12:15 crc kubenswrapper[4873]: I0219 10:12:15.808794 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-ssh-key-openstack-edpm-ipam\") pod \"ab7d5a49-ac61-4963-8766-1716098f3d4c\" (UID: \"ab7d5a49-ac61-4963-8766-1716098f3d4c\") "
Feb 19 10:12:15 crc kubenswrapper[4873]: I0219 10:12:15.808967 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-inventory\") pod \"ab7d5a49-ac61-4963-8766-1716098f3d4c\" (UID: \"ab7d5a49-ac61-4963-8766-1716098f3d4c\") "
Feb 19 10:12:15 crc kubenswrapper[4873]: I0219 10:12:15.809004 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj8jp\" (UniqueName: \"kubernetes.io/projected/ab7d5a49-ac61-4963-8766-1716098f3d4c-kube-api-access-cj8jp\") pod \"ab7d5a49-ac61-4963-8766-1716098f3d4c\" (UID: \"ab7d5a49-ac61-4963-8766-1716098f3d4c\") "
Feb 19 10:12:15 crc kubenswrapper[4873]: I0219 10:12:15.815374 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab7d5a49-ac61-4963-8766-1716098f3d4c-kube-api-access-cj8jp" (OuterVolumeSpecName: "kube-api-access-cj8jp") pod "ab7d5a49-ac61-4963-8766-1716098f3d4c" (UID: "ab7d5a49-ac61-4963-8766-1716098f3d4c"). InnerVolumeSpecName "kube-api-access-cj8jp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:12:15 crc kubenswrapper[4873]: E0219 10:12:15.848517 4873 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-inventory podName:ab7d5a49-ac61-4963-8766-1716098f3d4c nodeName:}" failed. No retries permitted until 2026-02-19 10:12:16.348489985 +0000 UTC m=+1645.637921623 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-inventory") pod "ab7d5a49-ac61-4963-8766-1716098f3d4c" (UID: "ab7d5a49-ac61-4963-8766-1716098f3d4c") : error deleting /var/lib/kubelet/pods/ab7d5a49-ac61-4963-8766-1716098f3d4c/volume-subpaths: remove /var/lib/kubelet/pods/ab7d5a49-ac61-4963-8766-1716098f3d4c/volume-subpaths: no such file or directory
Feb 19 10:12:15 crc kubenswrapper[4873]: I0219 10:12:15.852676 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ab7d5a49-ac61-4963-8766-1716098f3d4c" (UID: "ab7d5a49-ac61-4963-8766-1716098f3d4c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:12:15 crc kubenswrapper[4873]: I0219 10:12:15.911307 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 10:12:15 crc kubenswrapper[4873]: I0219 10:12:15.911344 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj8jp\" (UniqueName: \"kubernetes.io/projected/ab7d5a49-ac61-4963-8766-1716098f3d4c-kube-api-access-cj8jp\") on node \"crc\" DevicePath \"\""
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.340154 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj" event={"ID":"ab7d5a49-ac61-4963-8766-1716098f3d4c","Type":"ContainerDied","Data":"104a5f4bca47feaf4bdaf13262521e9dce1f7e82a92383ad952b55930e0a5622"}
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.340520 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="104a5f4bca47feaf4bdaf13262521e9dce1f7e82a92383ad952b55930e0a5622"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.340213 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.419493 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-inventory\") pod \"ab7d5a49-ac61-4963-8766-1716098f3d4c\" (UID: \"ab7d5a49-ac61-4963-8766-1716098f3d4c\") "
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.423942 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-inventory" (OuterVolumeSpecName: "inventory") pod "ab7d5a49-ac61-4963-8766-1716098f3d4c" (UID: "ab7d5a49-ac61-4963-8766-1716098f3d4c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.441687 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b"]
Feb 19 10:12:16 crc kubenswrapper[4873]: E0219 10:12:16.442122 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7d5a49-ac61-4963-8766-1716098f3d4c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.442138 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7d5a49-ac61-4963-8766-1716098f3d4c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.442364 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7d5a49-ac61-4963-8766-1716098f3d4c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.443165 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.464988 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b"]
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.522628 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0739ccd-765a-42c4-89b4-de6adf188e24-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-snp5b\" (UID: \"f0739ccd-765a-42c4-89b4-de6adf188e24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.522757 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f0739ccd-765a-42c4-89b4-de6adf188e24-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-snp5b\" (UID: \"f0739ccd-765a-42c4-89b4-de6adf188e24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.522792 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6ntc\" (UniqueName: \"kubernetes.io/projected/f0739ccd-765a-42c4-89b4-de6adf188e24-kube-api-access-j6ntc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-snp5b\" (UID: \"f0739ccd-765a-42c4-89b4-de6adf188e24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.524323 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab7d5a49-ac61-4963-8766-1716098f3d4c-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.625798 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0739ccd-765a-42c4-89b4-de6adf188e24-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-snp5b\" (UID: \"f0739ccd-765a-42c4-89b4-de6adf188e24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.625893 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f0739ccd-765a-42c4-89b4-de6adf188e24-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-snp5b\" (UID: \"f0739ccd-765a-42c4-89b4-de6adf188e24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.625924 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6ntc\" (UniqueName: \"kubernetes.io/projected/f0739ccd-765a-42c4-89b4-de6adf188e24-kube-api-access-j6ntc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-snp5b\" (UID: \"f0739ccd-765a-42c4-89b4-de6adf188e24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.630454 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0739ccd-765a-42c4-89b4-de6adf188e24-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-snp5b\" (UID: \"f0739ccd-765a-42c4-89b4-de6adf188e24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.630494 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f0739ccd-765a-42c4-89b4-de6adf188e24-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-snp5b\" (UID: \"f0739ccd-765a-42c4-89b4-de6adf188e24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.642334 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6ntc\" (UniqueName: \"kubernetes.io/projected/f0739ccd-765a-42c4-89b4-de6adf188e24-kube-api-access-j6ntc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-snp5b\" (UID: \"f0739ccd-765a-42c4-89b4-de6adf188e24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b"
Feb 19 10:12:16 crc kubenswrapper[4873]: I0219 10:12:16.793303 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b"
Feb 19 10:12:17 crc kubenswrapper[4873]: I0219 10:12:17.407689 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b"]
Feb 19 10:12:18 crc kubenswrapper[4873]: I0219 10:12:18.361449 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b" event={"ID":"f0739ccd-765a-42c4-89b4-de6adf188e24","Type":"ContainerStarted","Data":"0f2cd21fa77984269424706bd9843a6592fdd7f0b6272e7e9dfc6450008c946c"}
Feb 19 10:12:19 crc kubenswrapper[4873]: I0219 10:12:19.370987 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b" event={"ID":"f0739ccd-765a-42c4-89b4-de6adf188e24","Type":"ContainerStarted","Data":"3920eadc9823cf170b0e3ddab307f81fb31ae3300913abfa6bdf114e6376e26e"}
Feb 19 10:12:19 crc kubenswrapper[4873]: I0219 10:12:19.391027 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b" podStartSLOduration=2.720497514 podStartE2EDuration="3.391004538s" podCreationTimestamp="2026-02-19 10:12:16 +0000 UTC" firstStartedPulling="2026-02-19 10:12:17.419320673 +0000 UTC m=+1646.708752311" lastFinishedPulling="2026-02-19 10:12:18.089827697 +0000 UTC m=+1647.379259335" observedRunningTime="2026-02-19 10:12:19.385691684 +0000 UTC m=+1648.675123332" watchObservedRunningTime="2026-02-19 10:12:19.391004538 +0000 UTC m=+1648.680436186"
Feb 19 10:12:19 crc kubenswrapper[4873]: I0219 10:12:19.483829 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790"
Feb 19 10:12:19 crc kubenswrapper[4873]: E0219 10:12:19.484176 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:12:22 crc kubenswrapper[4873]: I0219 10:12:22.029935 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z6dkm"
Feb 19 10:12:22 crc kubenswrapper[4873]: I0219 10:12:22.076148 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z6dkm"
Feb 19 10:12:22 crc kubenswrapper[4873]: I0219 10:12:22.268854 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z6dkm"]
Feb 19 10:12:23 crc kubenswrapper[4873]: I0219 10:12:23.405811 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z6dkm" podUID="d83cc040-5619-46ef-9e78-b7b1f1117e79" containerName="registry-server" containerID="cri-o://c09685492dfc4d5d29f71ac9d4f3a6d2840be369e5f6cf0d017ea695c5bf24b4" gracePeriod=2
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.161517 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6dkm"
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.279845 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g88c8\" (UniqueName: \"kubernetes.io/projected/d83cc040-5619-46ef-9e78-b7b1f1117e79-kube-api-access-g88c8\") pod \"d83cc040-5619-46ef-9e78-b7b1f1117e79\" (UID: \"d83cc040-5619-46ef-9e78-b7b1f1117e79\") "
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.279941 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d83cc040-5619-46ef-9e78-b7b1f1117e79-catalog-content\") pod \"d83cc040-5619-46ef-9e78-b7b1f1117e79\" (UID: \"d83cc040-5619-46ef-9e78-b7b1f1117e79\") "
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.280032 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d83cc040-5619-46ef-9e78-b7b1f1117e79-utilities\") pod \"d83cc040-5619-46ef-9e78-b7b1f1117e79\" (UID: \"d83cc040-5619-46ef-9e78-b7b1f1117e79\") "
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.280956 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d83cc040-5619-46ef-9e78-b7b1f1117e79-utilities" (OuterVolumeSpecName: "utilities") pod "d83cc040-5619-46ef-9e78-b7b1f1117e79" (UID: "d83cc040-5619-46ef-9e78-b7b1f1117e79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.281130 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d83cc040-5619-46ef-9e78-b7b1f1117e79-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.292338 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d83cc040-5619-46ef-9e78-b7b1f1117e79-kube-api-access-g88c8" (OuterVolumeSpecName: "kube-api-access-g88c8") pod "d83cc040-5619-46ef-9e78-b7b1f1117e79" (UID: "d83cc040-5619-46ef-9e78-b7b1f1117e79"). InnerVolumeSpecName "kube-api-access-g88c8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.382507 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g88c8\" (UniqueName: \"kubernetes.io/projected/d83cc040-5619-46ef-9e78-b7b1f1117e79-kube-api-access-g88c8\") on node \"crc\" DevicePath \"\""
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.404286 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d83cc040-5619-46ef-9e78-b7b1f1117e79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d83cc040-5619-46ef-9e78-b7b1f1117e79" (UID: "d83cc040-5619-46ef-9e78-b7b1f1117e79"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.417613 4873 generic.go:334] "Generic (PLEG): container finished" podID="d83cc040-5619-46ef-9e78-b7b1f1117e79" containerID="c09685492dfc4d5d29f71ac9d4f3a6d2840be369e5f6cf0d017ea695c5bf24b4" exitCode=0
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.417652 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6dkm" event={"ID":"d83cc040-5619-46ef-9e78-b7b1f1117e79","Type":"ContainerDied","Data":"c09685492dfc4d5d29f71ac9d4f3a6d2840be369e5f6cf0d017ea695c5bf24b4"}
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.417679 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6dkm" event={"ID":"d83cc040-5619-46ef-9e78-b7b1f1117e79","Type":"ContainerDied","Data":"cc423b5accdb7784192d7c3d495f5ef39826afe8978f935d56b7fa1b762d6316"}
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.417700 4873 scope.go:117] "RemoveContainer" containerID="c09685492dfc4d5d29f71ac9d4f3a6d2840be369e5f6cf0d017ea695c5bf24b4"
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.417706 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6dkm"
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.458078 4873 scope.go:117] "RemoveContainer" containerID="e89706f794f3858742735b50534c9d6b0f3f672b1c1c6818cde4477bbe445d82"
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.465284 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z6dkm"]
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.477797 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z6dkm"]
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.485134 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d83cc040-5619-46ef-9e78-b7b1f1117e79-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.500076 4873 scope.go:117] "RemoveContainer" containerID="5b4b0b2b44266488a6c72bad150fbb6843f17c14bdc929c263ca20c5920041bd"
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.538549 4873 scope.go:117] "RemoveContainer" containerID="c09685492dfc4d5d29f71ac9d4f3a6d2840be369e5f6cf0d017ea695c5bf24b4"
Feb 19 10:12:24 crc kubenswrapper[4873]: E0219 10:12:24.539266 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c09685492dfc4d5d29f71ac9d4f3a6d2840be369e5f6cf0d017ea695c5bf24b4\": container with ID starting with c09685492dfc4d5d29f71ac9d4f3a6d2840be369e5f6cf0d017ea695c5bf24b4 not found: ID does not exist" containerID="c09685492dfc4d5d29f71ac9d4f3a6d2840be369e5f6cf0d017ea695c5bf24b4"
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.539306 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c09685492dfc4d5d29f71ac9d4f3a6d2840be369e5f6cf0d017ea695c5bf24b4"} err="failed to get container status \"c09685492dfc4d5d29f71ac9d4f3a6d2840be369e5f6cf0d017ea695c5bf24b4\": rpc error: code = NotFound desc = could not find container \"c09685492dfc4d5d29f71ac9d4f3a6d2840be369e5f6cf0d017ea695c5bf24b4\": container with ID starting with c09685492dfc4d5d29f71ac9d4f3a6d2840be369e5f6cf0d017ea695c5bf24b4 not found: ID does not exist"
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.539359 4873 scope.go:117] "RemoveContainer" containerID="e89706f794f3858742735b50534c9d6b0f3f672b1c1c6818cde4477bbe445d82"
Feb 19 10:12:24 crc kubenswrapper[4873]: E0219 10:12:24.539870 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e89706f794f3858742735b50534c9d6b0f3f672b1c1c6818cde4477bbe445d82\": container with ID starting with e89706f794f3858742735b50534c9d6b0f3f672b1c1c6818cde4477bbe445d82 not found: ID does not exist" containerID="e89706f794f3858742735b50534c9d6b0f3f672b1c1c6818cde4477bbe445d82"
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.539935 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e89706f794f3858742735b50534c9d6b0f3f672b1c1c6818cde4477bbe445d82"} err="failed to get container status \"e89706f794f3858742735b50534c9d6b0f3f672b1c1c6818cde4477bbe445d82\": rpc error: code = NotFound desc = could not find container \"e89706f794f3858742735b50534c9d6b0f3f672b1c1c6818cde4477bbe445d82\": container with ID starting with e89706f794f3858742735b50534c9d6b0f3f672b1c1c6818cde4477bbe445d82 not found: ID does not exist"
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.539963 4873 scope.go:117] "RemoveContainer" containerID="5b4b0b2b44266488a6c72bad150fbb6843f17c14bdc929c263ca20c5920041bd"
Feb 19 10:12:24 crc kubenswrapper[4873]: E0219 10:12:24.540295 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b4b0b2b44266488a6c72bad150fbb6843f17c14bdc929c263ca20c5920041bd\": container with ID starting with 5b4b0b2b44266488a6c72bad150fbb6843f17c14bdc929c263ca20c5920041bd not found: ID does not exist" containerID="5b4b0b2b44266488a6c72bad150fbb6843f17c14bdc929c263ca20c5920041bd"
Feb 19 10:12:24 crc kubenswrapper[4873]: I0219 10:12:24.540321 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b4b0b2b44266488a6c72bad150fbb6843f17c14bdc929c263ca20c5920041bd"} err="failed to get container status \"5b4b0b2b44266488a6c72bad150fbb6843f17c14bdc929c263ca20c5920041bd\": rpc error: code = NotFound desc = could not find container \"5b4b0b2b44266488a6c72bad150fbb6843f17c14bdc929c263ca20c5920041bd\": container with ID starting with 5b4b0b2b44266488a6c72bad150fbb6843f17c14bdc929c263ca20c5920041bd not found: ID does not exist"
Feb 19 10:12:25 crc kubenswrapper[4873]: I0219 10:12:25.496864 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d83cc040-5619-46ef-9e78-b7b1f1117e79" path="/var/lib/kubelet/pods/d83cc040-5619-46ef-9e78-b7b1f1117e79/volumes"
Feb 19 10:12:32 crc kubenswrapper[4873]: I0219 10:12:32.045474 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-9472r"]
Feb 19 10:12:32 crc kubenswrapper[4873]: I0219 10:12:32.056430 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-9472r"]
Feb 19 10:12:32 crc kubenswrapper[4873]: I0219 10:12:32.484920 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790"
Feb 19 10:12:32 crc kubenswrapper[4873]: E0219 10:12:32.485204 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:12:33 crc kubenswrapper[4873]: I0219 10:12:33.495960 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba" path="/var/lib/kubelet/pods/7ce0a8b9-b7a8-4ee7-8d68-0e6145ada6ba/volumes"
Feb 19 10:12:47 crc kubenswrapper[4873]: I0219 10:12:47.485052 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790"
Feb 19 10:12:47 crc kubenswrapper[4873]: E0219 10:12:47.485981 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:12:55 crc kubenswrapper[4873]: I0219 10:12:55.055301 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wrcpc"]
Feb 19 10:12:55 crc kubenswrapper[4873]: I0219 10:12:55.068079 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wrcpc"]
Feb 19 10:12:55 crc kubenswrapper[4873]: I0219 10:12:55.502998 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58099bc8-1a29-467b-b13d-c0713e42e6c2" path="/var/lib/kubelet/pods/58099bc8-1a29-467b-b13d-c0713e42e6c2/volumes"
Feb 19 10:12:57 crc kubenswrapper[4873]: I0219 10:12:57.178342 4873 scope.go:117] "RemoveContainer" containerID="c237b902a14f464df461cba85ac2f3875c00ea9082d53eccd8620f9ff36dbdf1"
Feb 19 10:12:57 crc kubenswrapper[4873]: I0219 10:12:57.219376 4873 scope.go:117] "RemoveContainer" containerID="270c7b1e210c60f9930081568a2a368a094d153be46a4131b2c800c6cabb0758"
Feb 19 10:12:57 crc kubenswrapper[4873]: I0219 10:12:57.287752 4873 scope.go:117] "RemoveContainer" containerID="f7fcab32e5de37523d8bdcbeaad1ae0eeef4c93525ac32d44d6c45730e393e7a"
Feb 19 10:12:57 crc kubenswrapper[4873]: I0219 10:12:57.328567 4873 scope.go:117] "RemoveContainer" containerID="fc653719445b1a5b46a480337ffb17668e9e8b070f487de6554f5c2e305620c3"
Feb 19 10:12:57 crc kubenswrapper[4873]: I0219 10:12:57.376924 4873 scope.go:117] "RemoveContainer" containerID="7d9cb5cecd99aa90e0a6558ac0a3e7fa7ae0c94550c983a65f7942335964abac"
Feb 19 10:12:57 crc kubenswrapper[4873]: I0219 10:12:57.428665 4873 scope.go:117] "RemoveContainer" containerID="e5eec3e87329724888651bc35b53713711df95ef48801142ac4dd2488284d91d"
Feb 19 10:12:57 crc kubenswrapper[4873]: I0219 10:12:57.488755 4873 scope.go:117] "RemoveContainer" containerID="c2bf0fbba2f57f774b52845ad4c11caabc24a49278d94ff2cd48b137e6aeb541"
Feb 19 10:12:57 crc kubenswrapper[4873]: I0219 10:12:57.543850 4873 scope.go:117] "RemoveContainer" containerID="d5e694f492487d42cde9591670ae968a23c3523d631e5fc12b8c943bcbf3ca29"
Feb 19 10:12:57 crc kubenswrapper[4873]: I0219 10:12:57.592665 4873 scope.go:117] "RemoveContainer" containerID="88965eb31897e7c9f4b9aa04da422e3396b97ead67a5f74aaa92bd82cf049dc5"
Feb 19 10:12:57 crc kubenswrapper[4873]: I0219 10:12:57.629073 4873 scope.go:117] "RemoveContainer" containerID="a196b181363d1056b517a86bc31a20f9a28399d782296cb45561ba646a621a77"
Feb 19 10:12:58 crc kubenswrapper[4873]: I0219 10:12:58.036041 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-98gbw"]
Feb 19 10:12:58 crc kubenswrapper[4873]: I0219 10:12:58.045237 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-98gbw"]
Feb 19 10:12:59 crc kubenswrapper[4873]: I0219 10:12:59.034309 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-4pv5z"]
Feb 19 10:12:59 crc kubenswrapper[4873]: I0219 10:12:59.044957 4873 kubelet.go:2431] "SyncLoop REMOVE"
source="api" pods=["openstack/barbican-db-sync-4pv5z"] Feb 19 10:12:59 crc kubenswrapper[4873]: I0219 10:12:59.496179 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943d069e-6ad4-4411-b937-c4499f0ced6f" path="/var/lib/kubelet/pods/943d069e-6ad4-4411-b937-c4499f0ced6f/volumes" Feb 19 10:12:59 crc kubenswrapper[4873]: I0219 10:12:59.498695 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec5489a2-23e2-4875-a19b-d15b4ad6c8c6" path="/var/lib/kubelet/pods/ec5489a2-23e2-4875-a19b-d15b4ad6c8c6/volumes" Feb 19 10:13:01 crc kubenswrapper[4873]: I0219 10:13:01.493314 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" Feb 19 10:13:01 crc kubenswrapper[4873]: E0219 10:13:01.493593 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:13:02 crc kubenswrapper[4873]: I0219 10:13:02.031734 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vf762"] Feb 19 10:13:02 crc kubenswrapper[4873]: I0219 10:13:02.040693 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vf762"] Feb 19 10:13:03 crc kubenswrapper[4873]: I0219 10:13:03.500065 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99868e3f-82d7-4f0c-9056-661e95486e6e" path="/var/lib/kubelet/pods/99868e3f-82d7-4f0c-9056-661e95486e6e/volumes" Feb 19 10:13:13 crc kubenswrapper[4873]: I0219 10:13:13.028980 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-gqrb5"] Feb 19 10:13:13 crc kubenswrapper[4873]: I0219 
10:13:13.040702 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-gqrb5"] Feb 19 10:13:13 crc kubenswrapper[4873]: I0219 10:13:13.484480 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" Feb 19 10:13:13 crc kubenswrapper[4873]: E0219 10:13:13.484955 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:13:13 crc kubenswrapper[4873]: I0219 10:13:13.499140 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce5accb4-1da0-4a21-a289-7dba33ad935f" path="/var/lib/kubelet/pods/ce5accb4-1da0-4a21-a289-7dba33ad935f/volumes" Feb 19 10:13:27 crc kubenswrapper[4873]: I0219 10:13:27.484137 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" Feb 19 10:13:27 crc kubenswrapper[4873]: E0219 10:13:27.484965 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:13:28 crc kubenswrapper[4873]: I0219 10:13:28.047938 4873 generic.go:334] "Generic (PLEG): container finished" podID="f0739ccd-765a-42c4-89b4-de6adf188e24" containerID="3920eadc9823cf170b0e3ddab307f81fb31ae3300913abfa6bdf114e6376e26e" exitCode=0 Feb 19 10:13:28 crc kubenswrapper[4873]: 
I0219 10:13:28.048005 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b" event={"ID":"f0739ccd-765a-42c4-89b4-de6adf188e24","Type":"ContainerDied","Data":"3920eadc9823cf170b0e3ddab307f81fb31ae3300913abfa6bdf114e6376e26e"} Feb 19 10:13:29 crc kubenswrapper[4873]: I0219 10:13:29.486519 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b" Feb 19 10:13:29 crc kubenswrapper[4873]: I0219 10:13:29.627419 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f0739ccd-765a-42c4-89b4-de6adf188e24-ssh-key-openstack-edpm-ipam\") pod \"f0739ccd-765a-42c4-89b4-de6adf188e24\" (UID: \"f0739ccd-765a-42c4-89b4-de6adf188e24\") " Feb 19 10:13:29 crc kubenswrapper[4873]: I0219 10:13:29.627767 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0739ccd-765a-42c4-89b4-de6adf188e24-inventory\") pod \"f0739ccd-765a-42c4-89b4-de6adf188e24\" (UID: \"f0739ccd-765a-42c4-89b4-de6adf188e24\") " Feb 19 10:13:29 crc kubenswrapper[4873]: I0219 10:13:29.628143 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6ntc\" (UniqueName: \"kubernetes.io/projected/f0739ccd-765a-42c4-89b4-de6adf188e24-kube-api-access-j6ntc\") pod \"f0739ccd-765a-42c4-89b4-de6adf188e24\" (UID: \"f0739ccd-765a-42c4-89b4-de6adf188e24\") " Feb 19 10:13:29 crc kubenswrapper[4873]: I0219 10:13:29.633490 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0739ccd-765a-42c4-89b4-de6adf188e24-kube-api-access-j6ntc" (OuterVolumeSpecName: "kube-api-access-j6ntc") pod "f0739ccd-765a-42c4-89b4-de6adf188e24" (UID: "f0739ccd-765a-42c4-89b4-de6adf188e24"). 
InnerVolumeSpecName "kube-api-access-j6ntc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:13:29 crc kubenswrapper[4873]: I0219 10:13:29.657250 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0739ccd-765a-42c4-89b4-de6adf188e24-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f0739ccd-765a-42c4-89b4-de6adf188e24" (UID: "f0739ccd-765a-42c4-89b4-de6adf188e24"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:13:29 crc kubenswrapper[4873]: I0219 10:13:29.658728 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0739ccd-765a-42c4-89b4-de6adf188e24-inventory" (OuterVolumeSpecName: "inventory") pod "f0739ccd-765a-42c4-89b4-de6adf188e24" (UID: "f0739ccd-765a-42c4-89b4-de6adf188e24"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:13:29 crc kubenswrapper[4873]: I0219 10:13:29.732133 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f0739ccd-765a-42c4-89b4-de6adf188e24-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:13:29 crc kubenswrapper[4873]: I0219 10:13:29.732165 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0739ccd-765a-42c4-89b4-de6adf188e24-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 10:13:29 crc kubenswrapper[4873]: I0219 10:13:29.732174 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6ntc\" (UniqueName: \"kubernetes.io/projected/f0739ccd-765a-42c4-89b4-de6adf188e24-kube-api-access-j6ntc\") on node \"crc\" DevicePath \"\"" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.067408 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b" event={"ID":"f0739ccd-765a-42c4-89b4-de6adf188e24","Type":"ContainerDied","Data":"0f2cd21fa77984269424706bd9843a6592fdd7f0b6272e7e9dfc6450008c946c"} Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.067460 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f2cd21fa77984269424706bd9843a6592fdd7f0b6272e7e9dfc6450008c946c" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.067486 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-snp5b" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.155401 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh"] Feb 19 10:13:30 crc kubenswrapper[4873]: E0219 10:13:30.155828 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0739ccd-765a-42c4-89b4-de6adf188e24" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.155850 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0739ccd-765a-42c4-89b4-de6adf188e24" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 10:13:30 crc kubenswrapper[4873]: E0219 10:13:30.155898 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d83cc040-5619-46ef-9e78-b7b1f1117e79" containerName="registry-server" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.155907 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d83cc040-5619-46ef-9e78-b7b1f1117e79" containerName="registry-server" Feb 19 10:13:30 crc kubenswrapper[4873]: E0219 10:13:30.155917 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d83cc040-5619-46ef-9e78-b7b1f1117e79" containerName="extract-content" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.155924 4873 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d83cc040-5619-46ef-9e78-b7b1f1117e79" containerName="extract-content" Feb 19 10:13:30 crc kubenswrapper[4873]: E0219 10:13:30.155942 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d83cc040-5619-46ef-9e78-b7b1f1117e79" containerName="extract-utilities" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.155950 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d83cc040-5619-46ef-9e78-b7b1f1117e79" containerName="extract-utilities" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.158237 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0739ccd-765a-42c4-89b4-de6adf188e24" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.158448 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="d83cc040-5619-46ef-9e78-b7b1f1117e79" containerName="registry-server" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.159915 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.162324 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.162429 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.163039 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.163192 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.178502 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh"] Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.240510 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28f40398-582f-40ed-92b8-2ff5a19d138d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh\" (UID: \"28f40398-582f-40ed-92b8-2ff5a19d138d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.240566 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28f40398-582f-40ed-92b8-2ff5a19d138d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh\" (UID: \"28f40398-582f-40ed-92b8-2ff5a19d138d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" Feb 19 10:13:30 crc kubenswrapper[4873]: 
I0219 10:13:30.240667 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-767x5\" (UniqueName: \"kubernetes.io/projected/28f40398-582f-40ed-92b8-2ff5a19d138d-kube-api-access-767x5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh\" (UID: \"28f40398-582f-40ed-92b8-2ff5a19d138d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.342696 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28f40398-582f-40ed-92b8-2ff5a19d138d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh\" (UID: \"28f40398-582f-40ed-92b8-2ff5a19d138d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.343172 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28f40398-582f-40ed-92b8-2ff5a19d138d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh\" (UID: \"28f40398-582f-40ed-92b8-2ff5a19d138d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.343219 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-767x5\" (UniqueName: \"kubernetes.io/projected/28f40398-582f-40ed-92b8-2ff5a19d138d-kube-api-access-767x5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh\" (UID: \"28f40398-582f-40ed-92b8-2ff5a19d138d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.352934 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/28f40398-582f-40ed-92b8-2ff5a19d138d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh\" (UID: \"28f40398-582f-40ed-92b8-2ff5a19d138d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.363070 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28f40398-582f-40ed-92b8-2ff5a19d138d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh\" (UID: \"28f40398-582f-40ed-92b8-2ff5a19d138d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.377524 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-767x5\" (UniqueName: \"kubernetes.io/projected/28f40398-582f-40ed-92b8-2ff5a19d138d-kube-api-access-767x5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh\" (UID: \"28f40398-582f-40ed-92b8-2ff5a19d138d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" Feb 19 10:13:30 crc kubenswrapper[4873]: I0219 10:13:30.479522 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" Feb 19 10:13:31 crc kubenswrapper[4873]: I0219 10:13:31.082828 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh"] Feb 19 10:13:31 crc kubenswrapper[4873]: W0219 10:13:31.093163 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28f40398_582f_40ed_92b8_2ff5a19d138d.slice/crio-9b5470c7d29ded12e8a197c8c4cfe7095c9f9b3f53492033dd1b642c2da889c1 WatchSource:0}: Error finding container 9b5470c7d29ded12e8a197c8c4cfe7095c9f9b3f53492033dd1b642c2da889c1: Status 404 returned error can't find the container with id 9b5470c7d29ded12e8a197c8c4cfe7095c9f9b3f53492033dd1b642c2da889c1 Feb 19 10:13:32 crc kubenswrapper[4873]: I0219 10:13:32.089526 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" event={"ID":"28f40398-582f-40ed-92b8-2ff5a19d138d","Type":"ContainerStarted","Data":"db6a1a39d537566ce77f3f1bcf766e8f8b64a23fdef4067758d74440b926bcbf"} Feb 19 10:13:32 crc kubenswrapper[4873]: I0219 10:13:32.089900 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" event={"ID":"28f40398-582f-40ed-92b8-2ff5a19d138d","Type":"ContainerStarted","Data":"9b5470c7d29ded12e8a197c8c4cfe7095c9f9b3f53492033dd1b642c2da889c1"} Feb 19 10:13:32 crc kubenswrapper[4873]: I0219 10:13:32.112244 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" podStartSLOduration=1.7246795050000001 podStartE2EDuration="2.112219667s" podCreationTimestamp="2026-02-19 10:13:30 +0000 UTC" firstStartedPulling="2026-02-19 10:13:31.095896145 +0000 UTC m=+1720.385327783" lastFinishedPulling="2026-02-19 10:13:31.483436307 +0000 
UTC m=+1720.772867945" observedRunningTime="2026-02-19 10:13:32.105394695 +0000 UTC m=+1721.394826333" watchObservedRunningTime="2026-02-19 10:13:32.112219667 +0000 UTC m=+1721.401651315" Feb 19 10:13:37 crc kubenswrapper[4873]: I0219 10:13:37.133524 4873 generic.go:334] "Generic (PLEG): container finished" podID="28f40398-582f-40ed-92b8-2ff5a19d138d" containerID="db6a1a39d537566ce77f3f1bcf766e8f8b64a23fdef4067758d74440b926bcbf" exitCode=0 Feb 19 10:13:37 crc kubenswrapper[4873]: I0219 10:13:37.133613 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" event={"ID":"28f40398-582f-40ed-92b8-2ff5a19d138d","Type":"ContainerDied","Data":"db6a1a39d537566ce77f3f1bcf766e8f8b64a23fdef4067758d74440b926bcbf"} Feb 19 10:13:38 crc kubenswrapper[4873]: I0219 10:13:38.569492 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" Feb 19 10:13:38 crc kubenswrapper[4873]: I0219 10:13:38.668598 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28f40398-582f-40ed-92b8-2ff5a19d138d-ssh-key-openstack-edpm-ipam\") pod \"28f40398-582f-40ed-92b8-2ff5a19d138d\" (UID: \"28f40398-582f-40ed-92b8-2ff5a19d138d\") " Feb 19 10:13:38 crc kubenswrapper[4873]: I0219 10:13:38.668815 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28f40398-582f-40ed-92b8-2ff5a19d138d-inventory\") pod \"28f40398-582f-40ed-92b8-2ff5a19d138d\" (UID: \"28f40398-582f-40ed-92b8-2ff5a19d138d\") " Feb 19 10:13:38 crc kubenswrapper[4873]: I0219 10:13:38.668876 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-767x5\" (UniqueName: \"kubernetes.io/projected/28f40398-582f-40ed-92b8-2ff5a19d138d-kube-api-access-767x5\") 
pod \"28f40398-582f-40ed-92b8-2ff5a19d138d\" (UID: \"28f40398-582f-40ed-92b8-2ff5a19d138d\") " Feb 19 10:13:38 crc kubenswrapper[4873]: I0219 10:13:38.676389 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f40398-582f-40ed-92b8-2ff5a19d138d-kube-api-access-767x5" (OuterVolumeSpecName: "kube-api-access-767x5") pod "28f40398-582f-40ed-92b8-2ff5a19d138d" (UID: "28f40398-582f-40ed-92b8-2ff5a19d138d"). InnerVolumeSpecName "kube-api-access-767x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:13:38 crc kubenswrapper[4873]: I0219 10:13:38.703429 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f40398-582f-40ed-92b8-2ff5a19d138d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "28f40398-582f-40ed-92b8-2ff5a19d138d" (UID: "28f40398-582f-40ed-92b8-2ff5a19d138d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:13:38 crc kubenswrapper[4873]: I0219 10:13:38.703853 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f40398-582f-40ed-92b8-2ff5a19d138d-inventory" (OuterVolumeSpecName: "inventory") pod "28f40398-582f-40ed-92b8-2ff5a19d138d" (UID: "28f40398-582f-40ed-92b8-2ff5a19d138d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:13:38 crc kubenswrapper[4873]: I0219 10:13:38.772462 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28f40398-582f-40ed-92b8-2ff5a19d138d-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 10:13:38 crc kubenswrapper[4873]: I0219 10:13:38.772585 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-767x5\" (UniqueName: \"kubernetes.io/projected/28f40398-582f-40ed-92b8-2ff5a19d138d-kube-api-access-767x5\") on node \"crc\" DevicePath \"\"" Feb 19 10:13:38 crc kubenswrapper[4873]: I0219 10:13:38.772655 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28f40398-582f-40ed-92b8-2ff5a19d138d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.153666 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" event={"ID":"28f40398-582f-40ed-92b8-2ff5a19d138d","Type":"ContainerDied","Data":"9b5470c7d29ded12e8a197c8c4cfe7095c9f9b3f53492033dd1b642c2da889c1"} Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.153991 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b5470c7d29ded12e8a197c8c4cfe7095c9f9b3f53492033dd1b642c2da889c1" Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.153735 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh" Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.232798 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"] Feb 19 10:13:39 crc kubenswrapper[4873]: E0219 10:13:39.233282 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f40398-582f-40ed-92b8-2ff5a19d138d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.233303 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f40398-582f-40ed-92b8-2ff5a19d138d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.233510 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f40398-582f-40ed-92b8-2ff5a19d138d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.234331 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.237370 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.239772 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.239798 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.240175 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5"
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.247659 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"]
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.384468 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b127e45-b09c-4e11-9423-58f1f51effd4-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s2jwj\" (UID: \"4b127e45-b09c-4e11-9423-58f1f51effd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.384685 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qbvd\" (UniqueName: \"kubernetes.io/projected/4b127e45-b09c-4e11-9423-58f1f51effd4-kube-api-access-5qbvd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s2jwj\" (UID: \"4b127e45-b09c-4e11-9423-58f1f51effd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.384737 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b127e45-b09c-4e11-9423-58f1f51effd4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s2jwj\" (UID: \"4b127e45-b09c-4e11-9423-58f1f51effd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.486550 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qbvd\" (UniqueName: \"kubernetes.io/projected/4b127e45-b09c-4e11-9423-58f1f51effd4-kube-api-access-5qbvd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s2jwj\" (UID: \"4b127e45-b09c-4e11-9423-58f1f51effd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.486646 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b127e45-b09c-4e11-9423-58f1f51effd4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s2jwj\" (UID: \"4b127e45-b09c-4e11-9423-58f1f51effd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.486739 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b127e45-b09c-4e11-9423-58f1f51effd4-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s2jwj\" (UID: \"4b127e45-b09c-4e11-9423-58f1f51effd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.493698 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b127e45-b09c-4e11-9423-58f1f51effd4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s2jwj\" (UID: \"4b127e45-b09c-4e11-9423-58f1f51effd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.493862 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b127e45-b09c-4e11-9423-58f1f51effd4-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s2jwj\" (UID: \"4b127e45-b09c-4e11-9423-58f1f51effd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.522917 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qbvd\" (UniqueName: \"kubernetes.io/projected/4b127e45-b09c-4e11-9423-58f1f51effd4-kube-api-access-5qbvd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-s2jwj\" (UID: \"4b127e45-b09c-4e11-9423-58f1f51effd4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"
Feb 19 10:13:39 crc kubenswrapper[4873]: I0219 10:13:39.552130 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"
Feb 19 10:13:40 crc kubenswrapper[4873]: I0219 10:13:40.072843 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"]
Feb 19 10:13:40 crc kubenswrapper[4873]: I0219 10:13:40.163233 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj" event={"ID":"4b127e45-b09c-4e11-9423-58f1f51effd4","Type":"ContainerStarted","Data":"43fe58502bbf464c2caec60cd36406ffe3030aecb4ac2b178dc13a054c2d3c80"}
Feb 19 10:13:41 crc kubenswrapper[4873]: I0219 10:13:41.173142 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj" event={"ID":"4b127e45-b09c-4e11-9423-58f1f51effd4","Type":"ContainerStarted","Data":"10b81f24298519dc0db6df1a4cede50ee4703691c259651a692c79216d96e98e"}
Feb 19 10:13:41 crc kubenswrapper[4873]: I0219 10:13:41.197145 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj" podStartSLOduration=1.740420973 podStartE2EDuration="2.197124799s" podCreationTimestamp="2026-02-19 10:13:39 +0000 UTC" firstStartedPulling="2026-02-19 10:13:40.081938063 +0000 UTC m=+1729.371369701" lastFinishedPulling="2026-02-19 10:13:40.538641859 +0000 UTC m=+1729.828073527" observedRunningTime="2026-02-19 10:13:41.189298292 +0000 UTC m=+1730.478729930" watchObservedRunningTime="2026-02-19 10:13:41.197124799 +0000 UTC m=+1730.486556437"
Feb 19 10:13:42 crc kubenswrapper[4873]: I0219 10:13:42.484314 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790"
Feb 19 10:13:42 crc kubenswrapper[4873]: E0219 10:13:42.484826 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:13:48 crc kubenswrapper[4873]: I0219 10:13:48.036495 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-83cd-account-create-update-9h25q"]
Feb 19 10:13:48 crc kubenswrapper[4873]: I0219 10:13:48.046836 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-83cd-account-create-update-9h25q"]
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.041713 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-hbt9r"]
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.057292 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-5862l"]
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.070377 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-cqfhq"]
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.078515 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d9dc-account-create-update-p9wrt"]
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.087457 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-00fb-account-create-update-4594l"]
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.098006 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-hbt9r"]
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.107637 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-00fb-account-create-update-4594l"]
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.116379 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d9dc-account-create-update-p9wrt"]
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.123524 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-5862l"]
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.130879 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-cqfhq"]
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.497093 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3129fa03-2686-49af-a434-341b19fb6661" path="/var/lib/kubelet/pods/3129fa03-2686-49af-a434-341b19fb6661/volumes"
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.498097 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c26aa2d-a8f4-4645-a1b6-055cb88e64d6" path="/var/lib/kubelet/pods/3c26aa2d-a8f4-4645-a1b6-055cb88e64d6/volumes"
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.498875 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79bae2a9-56d6-4292-b84b-c346934e5e08" path="/var/lib/kubelet/pods/79bae2a9-56d6-4292-b84b-c346934e5e08/volumes"
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.499623 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d06337-fba1-4b9c-abbc-02f635fd3bdd" path="/var/lib/kubelet/pods/b1d06337-fba1-4b9c-abbc-02f635fd3bdd/volumes"
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.500980 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0" path="/var/lib/kubelet/pods/bcf2377e-6f0a-4dcb-81ff-6ce950b9c3c0/volumes"
Feb 19 10:13:49 crc kubenswrapper[4873]: I0219 10:13:49.501743 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7623a19-7720-48a2-9a09-7c1d9d1acf3a" path="/var/lib/kubelet/pods/c7623a19-7720-48a2-9a09-7c1d9d1acf3a/volumes"
Feb 19 10:13:56 crc kubenswrapper[4873]: I0219 10:13:56.484627 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790"
Feb 19 10:13:56 crc kubenswrapper[4873]: E0219 10:13:56.485625 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:13:57 crc kubenswrapper[4873]: I0219 10:13:57.928833 4873 scope.go:117] "RemoveContainer" containerID="bd733b3ac9bfee3b4bbee18624c05c5bec301e1fa09008f9d0b7376ff957c31a"
Feb 19 10:13:57 crc kubenswrapper[4873]: I0219 10:13:57.964898 4873 scope.go:117] "RemoveContainer" containerID="4f7932193028af20a89fc4d6ec905cbeaeae8f2a0c2eccdd691dcdae0d83a150"
Feb 19 10:13:58 crc kubenswrapper[4873]: I0219 10:13:58.004818 4873 scope.go:117] "RemoveContainer" containerID="9a04cebf97180c8ea6d0724c6fe0c31aa2fbc8062f300b3608d26c13788862d9"
Feb 19 10:13:58 crc kubenswrapper[4873]: I0219 10:13:58.054509 4873 scope.go:117] "RemoveContainer" containerID="e32aae1cb5da5f588b5186b7220b1239b5386c9e999d9330ceeb577323a9711c"
Feb 19 10:13:58 crc kubenswrapper[4873]: I0219 10:13:58.102881 4873 scope.go:117] "RemoveContainer" containerID="e48e8c3f3cd0f5266c11f4e70aa13217161d56e20c834665683a06bd7308e111"
Feb 19 10:13:58 crc kubenswrapper[4873]: I0219 10:13:58.158442 4873 scope.go:117] "RemoveContainer" containerID="4b38c8bb0d66f0eab5c9674afb8862e41ff03592dbf71798d72618e652e32219"
Feb 19 10:13:58 crc kubenswrapper[4873]: I0219 10:13:58.204508 4873 scope.go:117] "RemoveContainer" containerID="dd8d0b4c8e6c8fa16639b3273dca3bab2c82aa1c797c85d4fed1f4b2808775ab"
Feb 19 10:13:58 crc kubenswrapper[4873]: I0219 10:13:58.222646 4873 scope.go:117] "RemoveContainer" containerID="2874d7c078f6aebe4e7f936700ecedd6916a0afc5a2e7ddcc365abe01b70926a"
Feb 19 10:13:58 crc kubenswrapper[4873]: I0219 10:13:58.244777 4873 scope.go:117] "RemoveContainer" containerID="ad115d69dacdb41d674a33e1db809c1ccc6821733d5a2f7e41e2ae5cb63809b4"
Feb 19 10:13:58 crc kubenswrapper[4873]: I0219 10:13:58.282876 4873 scope.go:117] "RemoveContainer" containerID="22b91ea45d57e3f8ed16da3e5a4058c15af39a6d914075c4521ba6755b03990b"
Feb 19 10:14:07 crc kubenswrapper[4873]: I0219 10:14:07.487358 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790"
Feb 19 10:14:07 crc kubenswrapper[4873]: E0219 10:14:07.491354 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:14:19 crc kubenswrapper[4873]: I0219 10:14:19.485283 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790"
Feb 19 10:14:19 crc kubenswrapper[4873]: E0219 10:14:19.485994 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:14:19 crc kubenswrapper[4873]: I0219 10:14:19.545351 4873 generic.go:334] "Generic (PLEG): container finished" podID="4b127e45-b09c-4e11-9423-58f1f51effd4" containerID="10b81f24298519dc0db6df1a4cede50ee4703691c259651a692c79216d96e98e" exitCode=0
Feb 19 10:14:19 crc kubenswrapper[4873]: I0219 10:14:19.545396 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj" event={"ID":"4b127e45-b09c-4e11-9423-58f1f51effd4","Type":"ContainerDied","Data":"10b81f24298519dc0db6df1a4cede50ee4703691c259651a692c79216d96e98e"}
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.009870 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.196460 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b127e45-b09c-4e11-9423-58f1f51effd4-inventory\") pod \"4b127e45-b09c-4e11-9423-58f1f51effd4\" (UID: \"4b127e45-b09c-4e11-9423-58f1f51effd4\") "
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.197125 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qbvd\" (UniqueName: \"kubernetes.io/projected/4b127e45-b09c-4e11-9423-58f1f51effd4-kube-api-access-5qbvd\") pod \"4b127e45-b09c-4e11-9423-58f1f51effd4\" (UID: \"4b127e45-b09c-4e11-9423-58f1f51effd4\") "
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.197204 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b127e45-b09c-4e11-9423-58f1f51effd4-ssh-key-openstack-edpm-ipam\") pod \"4b127e45-b09c-4e11-9423-58f1f51effd4\" (UID: \"4b127e45-b09c-4e11-9423-58f1f51effd4\") "
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.206308 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b127e45-b09c-4e11-9423-58f1f51effd4-kube-api-access-5qbvd" (OuterVolumeSpecName: "kube-api-access-5qbvd") pod "4b127e45-b09c-4e11-9423-58f1f51effd4" (UID: "4b127e45-b09c-4e11-9423-58f1f51effd4"). InnerVolumeSpecName "kube-api-access-5qbvd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.222939 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b127e45-b09c-4e11-9423-58f1f51effd4-inventory" (OuterVolumeSpecName: "inventory") pod "4b127e45-b09c-4e11-9423-58f1f51effd4" (UID: "4b127e45-b09c-4e11-9423-58f1f51effd4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.224844 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b127e45-b09c-4e11-9423-58f1f51effd4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4b127e45-b09c-4e11-9423-58f1f51effd4" (UID: "4b127e45-b09c-4e11-9423-58f1f51effd4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.299926 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b127e45-b09c-4e11-9423-58f1f51effd4-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.299961 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qbvd\" (UniqueName: \"kubernetes.io/projected/4b127e45-b09c-4e11-9423-58f1f51effd4-kube-api-access-5qbvd\") on node \"crc\" DevicePath \"\""
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.299971 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b127e45-b09c-4e11-9423-58f1f51effd4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.564511 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj" event={"ID":"4b127e45-b09c-4e11-9423-58f1f51effd4","Type":"ContainerDied","Data":"43fe58502bbf464c2caec60cd36406ffe3030aecb4ac2b178dc13a054c2d3c80"}
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.564562 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43fe58502bbf464c2caec60cd36406ffe3030aecb4ac2b178dc13a054c2d3c80"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.564570 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-s2jwj"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.659527 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2"]
Feb 19 10:14:21 crc kubenswrapper[4873]: E0219 10:14:21.659909 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b127e45-b09c-4e11-9423-58f1f51effd4" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.659929 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b127e45-b09c-4e11-9423-58f1f51effd4" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.660122 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b127e45-b09c-4e11-9423-58f1f51effd4" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.660842 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.662463 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.662649 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.662761 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.664020 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.694511 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2"]
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.808877 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40ec1f13-0b91-4c7c-a13e-11e60f55e627-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2\" (UID: \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.809151 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40ec1f13-0b91-4c7c-a13e-11e60f55e627-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2\" (UID: \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.809211 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz2vh\" (UniqueName: \"kubernetes.io/projected/40ec1f13-0b91-4c7c-a13e-11e60f55e627-kube-api-access-pz2vh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2\" (UID: \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.911814 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40ec1f13-0b91-4c7c-a13e-11e60f55e627-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2\" (UID: \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.912125 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz2vh\" (UniqueName: \"kubernetes.io/projected/40ec1f13-0b91-4c7c-a13e-11e60f55e627-kube-api-access-pz2vh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2\" (UID: \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.912377 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40ec1f13-0b91-4c7c-a13e-11e60f55e627-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2\" (UID: \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.919275 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40ec1f13-0b91-4c7c-a13e-11e60f55e627-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2\" (UID: \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.919722 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40ec1f13-0b91-4c7c-a13e-11e60f55e627-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2\" (UID: \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.929307 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz2vh\" (UniqueName: \"kubernetes.io/projected/40ec1f13-0b91-4c7c-a13e-11e60f55e627-kube-api-access-pz2vh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2\" (UID: \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2"
Feb 19 10:14:21 crc kubenswrapper[4873]: I0219 10:14:21.978201 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2"
Feb 19 10:14:22 crc kubenswrapper[4873]: I0219 10:14:22.538255 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2"]
Feb 19 10:14:22 crc kubenswrapper[4873]: I0219 10:14:22.574480 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2" event={"ID":"40ec1f13-0b91-4c7c-a13e-11e60f55e627","Type":"ContainerStarted","Data":"c0ca2118706be6749415b1d33611a0ef01e91959611996a49fa7429d27412f42"}
Feb 19 10:14:23 crc kubenswrapper[4873]: I0219 10:14:23.583854 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2" event={"ID":"40ec1f13-0b91-4c7c-a13e-11e60f55e627","Type":"ContainerStarted","Data":"3d4d35c803c524343d799d1966633f50e6268b58e19fac8f6e1497548c00acc7"}
Feb 19 10:14:23 crc kubenswrapper[4873]: I0219 10:14:23.605026 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2" podStartSLOduration=2.1402025829999998 podStartE2EDuration="2.605008266s" podCreationTimestamp="2026-02-19 10:14:21 +0000 UTC" firstStartedPulling="2026-02-19 10:14:22.544955586 +0000 UTC m=+1771.834387224" lastFinishedPulling="2026-02-19 10:14:23.009761269 +0000 UTC m=+1772.299192907" observedRunningTime="2026-02-19 10:14:23.596074284 +0000 UTC m=+1772.885505922" watchObservedRunningTime="2026-02-19 10:14:23.605008266 +0000 UTC m=+1772.894439904"
Feb 19 10:14:24 crc kubenswrapper[4873]: I0219 10:14:24.043303 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgm8t"]
Feb 19 10:14:24 crc kubenswrapper[4873]: I0219 10:14:24.054831 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgm8t"]
Feb 19 10:14:25 crc kubenswrapper[4873]: I0219 10:14:25.494897 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f8fe617-c1d5-41f8-a23a-eeb88444f620" path="/var/lib/kubelet/pods/2f8fe617-c1d5-41f8-a23a-eeb88444f620/volumes"
Feb 19 10:14:32 crc kubenswrapper[4873]: I0219 10:14:32.483667 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790"
Feb 19 10:14:32 crc kubenswrapper[4873]: E0219 10:14:32.484194 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:14:46 crc kubenswrapper[4873]: I0219 10:14:46.484949 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790"
Feb 19 10:14:46 crc kubenswrapper[4873]: E0219 10:14:46.485657 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:14:48 crc kubenswrapper[4873]: I0219 10:14:48.052591 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xzlnz"]
Feb 19 10:14:48 crc kubenswrapper[4873]: I0219 10:14:48.064201 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xzlnz"]
Feb 19 10:14:49 crc kubenswrapper[4873]: I0219 10:14:49.496736 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54b81c17-9130-4def-8021-e73168601bf6" path="/var/lib/kubelet/pods/54b81c17-9130-4def-8021-e73168601bf6/volumes"
Feb 19 10:14:56 crc kubenswrapper[4873]: I0219 10:14:56.024835 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9z5nq"]
Feb 19 10:14:56 crc kubenswrapper[4873]: I0219 10:14:56.034253 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9z5nq"]
Feb 19 10:14:57 crc kubenswrapper[4873]: I0219 10:14:57.494012 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96fca831-509a-4abd-bb7e-2c0f4704368b" path="/var/lib/kubelet/pods/96fca831-509a-4abd-bb7e-2c0f4704368b/volumes"
Feb 19 10:14:58 crc kubenswrapper[4873]: I0219 10:14:58.484649 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790"
Feb 19 10:14:58 crc kubenswrapper[4873]: E0219 10:14:58.485028 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:14:58 crc kubenswrapper[4873]: I0219 10:14:58.513872 4873 scope.go:117] "RemoveContainer" containerID="343f5f5d97db66db1963a29b53ef93078842c1069756343de5aa869eb8885cd9"
Feb 19 10:14:58 crc kubenswrapper[4873]: I0219 10:14:58.560564 4873 scope.go:117] "RemoveContainer" containerID="c163e1fbae8dfb18e81d4177d941e04ca8d149e8d88a196ee094871f3dd31d8c"
Feb 19 10:14:58 crc kubenswrapper[4873]: I0219 10:14:58.629833 4873 scope.go:117] "RemoveContainer" containerID="e9c86902c9a53b767e99a6b86b96ed298fea1e2244a0785fd8c7eeb7d4f69fa7"
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.148981 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"]
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.150944 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.153370 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.154380 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.162566 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"]
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.165221 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-config-volume\") pod \"collect-profiles-29524935-sjf58\" (UID: \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.165264 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-secret-volume\") pod \"collect-profiles-29524935-sjf58\" (UID: \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.165326 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t4rh\" (UniqueName: \"kubernetes.io/projected/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-kube-api-access-5t4rh\") pod \"collect-profiles-29524935-sjf58\" (UID: \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.268086 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-config-volume\") pod \"collect-profiles-29524935-sjf58\" (UID: \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.268185 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-secret-volume\") pod \"collect-profiles-29524935-sjf58\" (UID: \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.268261 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t4rh\" (UniqueName: \"kubernetes.io/projected/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-kube-api-access-5t4rh\") pod \"collect-profiles-29524935-sjf58\" (UID: \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.269158 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-config-volume\") pod \"collect-profiles-29524935-sjf58\" (UID: \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.277344 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-secret-volume\") pod \"collect-profiles-29524935-sjf58\" (UID: \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.285865 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t4rh\" (UniqueName: \"kubernetes.io/projected/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-kube-api-access-5t4rh\") pod \"collect-profiles-29524935-sjf58\" (UID: \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.506654 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"
Feb 19 10:15:00 crc kubenswrapper[4873]: I0219 10:15:00.969622 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"]
Feb 19 10:15:01 crc kubenswrapper[4873]: I0219 10:15:01.910810 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58" event={"ID":"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580","Type":"ContainerStarted","Data":"51b872a4026735697f0f9cc00b395427fbc06efd93a529d94f5319e0d220778e"}
Feb 19 10:15:01 crc kubenswrapper[4873]: I0219 10:15:01.911161 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58" event={"ID":"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580","Type":"ContainerStarted","Data":"ed5cb2eb001ffb4cf09c9673a7d4a3d78fcb8812596090a28da42ffffec36654"}
Feb 19 10:15:01 crc kubenswrapper[4873]: I0219 10:15:01.934406 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58" podStartSLOduration=1.9343864229999999 podStartE2EDuration="1.934386423s" podCreationTimestamp="2026-02-19 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:15:01.927700417 +0000 UTC m=+1811.217132045" watchObservedRunningTime="2026-02-19 10:15:01.934386423 +0000 UTC m=+1811.223818071"
Feb 19 10:15:02 crc kubenswrapper[4873]: I0219 10:15:02.923346 4873 generic.go:334] "Generic (PLEG): container finished" podID="fb4ec2bd-4c16-4682-873a-4fbdcc5d9580" containerID="51b872a4026735697f0f9cc00b395427fbc06efd93a529d94f5319e0d220778e" exitCode=0
Feb 19 10:15:02 crc kubenswrapper[4873]: I0219 10:15:02.923392 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58" event={"ID":"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580","Type":"ContainerDied","Data":"51b872a4026735697f0f9cc00b395427fbc06efd93a529d94f5319e0d220778e"}
Feb 19 10:15:04 crc kubenswrapper[4873]: I0219 10:15:04.244762 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"
Feb 19 10:15:04 crc kubenswrapper[4873]: I0219 10:15:04.352762 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-secret-volume\") pod \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\" (UID: \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\") "
Feb 19 10:15:04 crc kubenswrapper[4873]: I0219 10:15:04.352895 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t4rh\" (UniqueName: \"kubernetes.io/projected/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-kube-api-access-5t4rh\") pod \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\" (UID: \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\") "
Feb 19 10:15:04 crc kubenswrapper[4873]: I0219 10:15:04.352992 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-config-volume\") pod \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\" (UID: \"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580\") "
Feb 19 10:15:04 crc kubenswrapper[4873]: I0219 10:15:04.353984 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-config-volume" (OuterVolumeSpecName: "config-volume") pod "fb4ec2bd-4c16-4682-873a-4fbdcc5d9580" (UID: "fb4ec2bd-4c16-4682-873a-4fbdcc5d9580"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:15:04 crc kubenswrapper[4873]: I0219 10:15:04.358747 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-kube-api-access-5t4rh" (OuterVolumeSpecName: "kube-api-access-5t4rh") pod "fb4ec2bd-4c16-4682-873a-4fbdcc5d9580" (UID: "fb4ec2bd-4c16-4682-873a-4fbdcc5d9580").
InnerVolumeSpecName "kube-api-access-5t4rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:15:04 crc kubenswrapper[4873]: I0219 10:15:04.359414 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fb4ec2bd-4c16-4682-873a-4fbdcc5d9580" (UID: "fb4ec2bd-4c16-4682-873a-4fbdcc5d9580"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:15:04 crc kubenswrapper[4873]: I0219 10:15:04.455490 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t4rh\" (UniqueName: \"kubernetes.io/projected/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-kube-api-access-5t4rh\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:04 crc kubenswrapper[4873]: I0219 10:15:04.455527 4873 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:04 crc kubenswrapper[4873]: I0219 10:15:04.455537 4873 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:04 crc kubenswrapper[4873]: I0219 10:15:04.939471 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58" event={"ID":"fb4ec2bd-4c16-4682-873a-4fbdcc5d9580","Type":"ContainerDied","Data":"ed5cb2eb001ffb4cf09c9673a7d4a3d78fcb8812596090a28da42ffffec36654"} Feb 19 10:15:04 crc kubenswrapper[4873]: I0219 10:15:04.939510 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed5cb2eb001ffb4cf09c9673a7d4a3d78fcb8812596090a28da42ffffec36654" Feb 19 10:15:04 crc kubenswrapper[4873]: I0219 10:15:04.939552 4873 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58" Feb 19 10:15:11 crc kubenswrapper[4873]: I0219 10:15:11.490086 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" Feb 19 10:15:11 crc kubenswrapper[4873]: E0219 10:15:11.490896 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:15:14 crc kubenswrapper[4873]: I0219 10:15:14.021002 4873 generic.go:334] "Generic (PLEG): container finished" podID="40ec1f13-0b91-4c7c-a13e-11e60f55e627" containerID="3d4d35c803c524343d799d1966633f50e6268b58e19fac8f6e1497548c00acc7" exitCode=0 Feb 19 10:15:14 crc kubenswrapper[4873]: I0219 10:15:14.021067 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2" event={"ID":"40ec1f13-0b91-4c7c-a13e-11e60f55e627","Type":"ContainerDied","Data":"3d4d35c803c524343d799d1966633f50e6268b58e19fac8f6e1497548c00acc7"} Feb 19 10:15:15 crc kubenswrapper[4873]: I0219 10:15:15.421519 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2" Feb 19 10:15:15 crc kubenswrapper[4873]: I0219 10:15:15.577746 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40ec1f13-0b91-4c7c-a13e-11e60f55e627-ssh-key-openstack-edpm-ipam\") pod \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\" (UID: \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\") " Feb 19 10:15:15 crc kubenswrapper[4873]: I0219 10:15:15.577900 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz2vh\" (UniqueName: \"kubernetes.io/projected/40ec1f13-0b91-4c7c-a13e-11e60f55e627-kube-api-access-pz2vh\") pod \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\" (UID: \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\") " Feb 19 10:15:15 crc kubenswrapper[4873]: I0219 10:15:15.577942 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40ec1f13-0b91-4c7c-a13e-11e60f55e627-inventory\") pod \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\" (UID: \"40ec1f13-0b91-4c7c-a13e-11e60f55e627\") " Feb 19 10:15:15 crc kubenswrapper[4873]: I0219 10:15:15.584523 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ec1f13-0b91-4c7c-a13e-11e60f55e627-kube-api-access-pz2vh" (OuterVolumeSpecName: "kube-api-access-pz2vh") pod "40ec1f13-0b91-4c7c-a13e-11e60f55e627" (UID: "40ec1f13-0b91-4c7c-a13e-11e60f55e627"). InnerVolumeSpecName "kube-api-access-pz2vh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:15:15 crc kubenswrapper[4873]: I0219 10:15:15.608576 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40ec1f13-0b91-4c7c-a13e-11e60f55e627-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "40ec1f13-0b91-4c7c-a13e-11e60f55e627" (UID: "40ec1f13-0b91-4c7c-a13e-11e60f55e627"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:15:15 crc kubenswrapper[4873]: I0219 10:15:15.631115 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40ec1f13-0b91-4c7c-a13e-11e60f55e627-inventory" (OuterVolumeSpecName: "inventory") pod "40ec1f13-0b91-4c7c-a13e-11e60f55e627" (UID: "40ec1f13-0b91-4c7c-a13e-11e60f55e627"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:15:15 crc kubenswrapper[4873]: I0219 10:15:15.681300 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz2vh\" (UniqueName: \"kubernetes.io/projected/40ec1f13-0b91-4c7c-a13e-11e60f55e627-kube-api-access-pz2vh\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:15 crc kubenswrapper[4873]: I0219 10:15:15.681337 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40ec1f13-0b91-4c7c-a13e-11e60f55e627-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:15 crc kubenswrapper[4873]: I0219 10:15:15.681353 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40ec1f13-0b91-4c7c-a13e-11e60f55e627-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.045670 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2" 
event={"ID":"40ec1f13-0b91-4c7c-a13e-11e60f55e627","Type":"ContainerDied","Data":"c0ca2118706be6749415b1d33611a0ef01e91959611996a49fa7429d27412f42"} Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.045719 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0ca2118706be6749415b1d33611a0ef01e91959611996a49fa7429d27412f42" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.045780 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2" Feb 19 10:15:16 crc kubenswrapper[4873]: E0219 10:15:16.154954 4873 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40ec1f13_0b91_4c7c_a13e_11e60f55e627.slice\": RecentStats: unable to find data in memory cache]" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.220060 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sw8hj"] Feb 19 10:15:16 crc kubenswrapper[4873]: E0219 10:15:16.221228 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ec1f13-0b91-4c7c-a13e-11e60f55e627" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.221327 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ec1f13-0b91-4c7c-a13e-11e60f55e627" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 19 10:15:16 crc kubenswrapper[4873]: E0219 10:15:16.221429 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb4ec2bd-4c16-4682-873a-4fbdcc5d9580" containerName="collect-profiles" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.221527 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb4ec2bd-4c16-4682-873a-4fbdcc5d9580" containerName="collect-profiles" Feb 19 10:15:16 crc kubenswrapper[4873]: 
I0219 10:15:16.222018 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="40ec1f13-0b91-4c7c-a13e-11e60f55e627" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.222179 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb4ec2bd-4c16-4682-873a-4fbdcc5d9580" containerName="collect-profiles" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.223378 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.225594 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.225763 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.226220 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.226223 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.233010 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sw8hj"] Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.393571 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/15999617-f2b4-4a3f-911d-422db799fa37-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sw8hj\" (UID: \"15999617-f2b4-4a3f-911d-422db799fa37\") " pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.393709 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/15999617-f2b4-4a3f-911d-422db799fa37-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sw8hj\" (UID: \"15999617-f2b4-4a3f-911d-422db799fa37\") " pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.393809 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px675\" (UniqueName: \"kubernetes.io/projected/15999617-f2b4-4a3f-911d-422db799fa37-kube-api-access-px675\") pod \"ssh-known-hosts-edpm-deployment-sw8hj\" (UID: \"15999617-f2b4-4a3f-911d-422db799fa37\") " pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.495723 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/15999617-f2b4-4a3f-911d-422db799fa37-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sw8hj\" (UID: \"15999617-f2b4-4a3f-911d-422db799fa37\") " pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.495788 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/15999617-f2b4-4a3f-911d-422db799fa37-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sw8hj\" (UID: \"15999617-f2b4-4a3f-911d-422db799fa37\") " pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.495821 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px675\" (UniqueName: \"kubernetes.io/projected/15999617-f2b4-4a3f-911d-422db799fa37-kube-api-access-px675\") pod \"ssh-known-hosts-edpm-deployment-sw8hj\" (UID: \"15999617-f2b4-4a3f-911d-422db799fa37\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.500931 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/15999617-f2b4-4a3f-911d-422db799fa37-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-sw8hj\" (UID: \"15999617-f2b4-4a3f-911d-422db799fa37\") " pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.501490 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/15999617-f2b4-4a3f-911d-422db799fa37-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-sw8hj\" (UID: \"15999617-f2b4-4a3f-911d-422db799fa37\") " pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.512915 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px675\" (UniqueName: \"kubernetes.io/projected/15999617-f2b4-4a3f-911d-422db799fa37-kube-api-access-px675\") pod \"ssh-known-hosts-edpm-deployment-sw8hj\" (UID: \"15999617-f2b4-4a3f-911d-422db799fa37\") " pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" Feb 19 10:15:16 crc kubenswrapper[4873]: I0219 10:15:16.539242 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" Feb 19 10:15:17 crc kubenswrapper[4873]: I0219 10:15:17.077908 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-sw8hj"] Feb 19 10:15:18 crc kubenswrapper[4873]: I0219 10:15:18.067536 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" event={"ID":"15999617-f2b4-4a3f-911d-422db799fa37","Type":"ContainerStarted","Data":"13e39142bcaa9f5eb7b0f72d48b915ecda145dd5430ccf135924bd5a76f7486b"} Feb 19 10:15:18 crc kubenswrapper[4873]: I0219 10:15:18.068218 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" event={"ID":"15999617-f2b4-4a3f-911d-422db799fa37","Type":"ContainerStarted","Data":"dedd5cc5a97e635d8fde4448e433d1bd645acff522100d4963d6d3e8d6e972fc"} Feb 19 10:15:18 crc kubenswrapper[4873]: I0219 10:15:18.092614 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" podStartSLOduration=1.700058624 podStartE2EDuration="2.092580958s" podCreationTimestamp="2026-02-19 10:15:16 +0000 UTC" firstStartedPulling="2026-02-19 10:15:17.087756362 +0000 UTC m=+1826.377188010" lastFinishedPulling="2026-02-19 10:15:17.480278706 +0000 UTC m=+1826.769710344" observedRunningTime="2026-02-19 10:15:18.089587033 +0000 UTC m=+1827.379018721" watchObservedRunningTime="2026-02-19 10:15:18.092580958 +0000 UTC m=+1827.382012636" Feb 19 10:15:23 crc kubenswrapper[4873]: I0219 10:15:23.483945 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" Feb 19 10:15:24 crc kubenswrapper[4873]: I0219 10:15:24.120299 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" 
event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"4ac19c5656812709276b88acb63ac96a06120ceef6f1ea4e7a6c41a75ff13fe5"} Feb 19 10:15:25 crc kubenswrapper[4873]: I0219 10:15:25.129945 4873 generic.go:334] "Generic (PLEG): container finished" podID="15999617-f2b4-4a3f-911d-422db799fa37" containerID="13e39142bcaa9f5eb7b0f72d48b915ecda145dd5430ccf135924bd5a76f7486b" exitCode=0 Feb 19 10:15:25 crc kubenswrapper[4873]: I0219 10:15:25.130043 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" event={"ID":"15999617-f2b4-4a3f-911d-422db799fa37","Type":"ContainerDied","Data":"13e39142bcaa9f5eb7b0f72d48b915ecda145dd5430ccf135924bd5a76f7486b"} Feb 19 10:15:26 crc kubenswrapper[4873]: I0219 10:15:26.538237 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" Feb 19 10:15:26 crc kubenswrapper[4873]: I0219 10:15:26.688858 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px675\" (UniqueName: \"kubernetes.io/projected/15999617-f2b4-4a3f-911d-422db799fa37-kube-api-access-px675\") pod \"15999617-f2b4-4a3f-911d-422db799fa37\" (UID: \"15999617-f2b4-4a3f-911d-422db799fa37\") " Feb 19 10:15:26 crc kubenswrapper[4873]: I0219 10:15:26.688951 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/15999617-f2b4-4a3f-911d-422db799fa37-ssh-key-openstack-edpm-ipam\") pod \"15999617-f2b4-4a3f-911d-422db799fa37\" (UID: \"15999617-f2b4-4a3f-911d-422db799fa37\") " Feb 19 10:15:26 crc kubenswrapper[4873]: I0219 10:15:26.689143 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/15999617-f2b4-4a3f-911d-422db799fa37-inventory-0\") pod \"15999617-f2b4-4a3f-911d-422db799fa37\" (UID: 
\"15999617-f2b4-4a3f-911d-422db799fa37\") " Feb 19 10:15:26 crc kubenswrapper[4873]: I0219 10:15:26.694714 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15999617-f2b4-4a3f-911d-422db799fa37-kube-api-access-px675" (OuterVolumeSpecName: "kube-api-access-px675") pod "15999617-f2b4-4a3f-911d-422db799fa37" (UID: "15999617-f2b4-4a3f-911d-422db799fa37"). InnerVolumeSpecName "kube-api-access-px675". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:15:26 crc kubenswrapper[4873]: I0219 10:15:26.715755 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15999617-f2b4-4a3f-911d-422db799fa37-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "15999617-f2b4-4a3f-911d-422db799fa37" (UID: "15999617-f2b4-4a3f-911d-422db799fa37"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:15:26 crc kubenswrapper[4873]: I0219 10:15:26.730865 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15999617-f2b4-4a3f-911d-422db799fa37-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "15999617-f2b4-4a3f-911d-422db799fa37" (UID: "15999617-f2b4-4a3f-911d-422db799fa37"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:15:26 crc kubenswrapper[4873]: I0219 10:15:26.792189 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px675\" (UniqueName: \"kubernetes.io/projected/15999617-f2b4-4a3f-911d-422db799fa37-kube-api-access-px675\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:26 crc kubenswrapper[4873]: I0219 10:15:26.792276 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/15999617-f2b4-4a3f-911d-422db799fa37-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:26 crc kubenswrapper[4873]: I0219 10:15:26.792293 4873 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/15999617-f2b4-4a3f-911d-422db799fa37-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.148790 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" event={"ID":"15999617-f2b4-4a3f-911d-422db799fa37","Type":"ContainerDied","Data":"dedd5cc5a97e635d8fde4448e433d1bd645acff522100d4963d6d3e8d6e972fc"} Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.148831 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dedd5cc5a97e635d8fde4448e433d1bd645acff522100d4963d6d3e8d6e972fc" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.149163 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-sw8hj" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.235445 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf"] Feb 19 10:15:27 crc kubenswrapper[4873]: E0219 10:15:27.235871 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15999617-f2b4-4a3f-911d-422db799fa37" containerName="ssh-known-hosts-edpm-deployment" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.235914 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="15999617-f2b4-4a3f-911d-422db799fa37" containerName="ssh-known-hosts-edpm-deployment" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.236208 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="15999617-f2b4-4a3f-911d-422db799fa37" containerName="ssh-known-hosts-edpm-deployment" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.236860 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.239354 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.239466 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.240093 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.243859 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.259168 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf"] Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.406383 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6bl7\" (UniqueName: \"kubernetes.io/projected/7843f72c-5559-44d6-86e0-62f013e0a073-kube-api-access-f6bl7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wvjf\" (UID: \"7843f72c-5559-44d6-86e0-62f013e0a073\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.406473 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7843f72c-5559-44d6-86e0-62f013e0a073-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wvjf\" (UID: \"7843f72c-5559-44d6-86e0-62f013e0a073\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.406597 4873 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7843f72c-5559-44d6-86e0-62f013e0a073-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wvjf\" (UID: \"7843f72c-5559-44d6-86e0-62f013e0a073\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.508096 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7843f72c-5559-44d6-86e0-62f013e0a073-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wvjf\" (UID: \"7843f72c-5559-44d6-86e0-62f013e0a073\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.508255 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6bl7\" (UniqueName: \"kubernetes.io/projected/7843f72c-5559-44d6-86e0-62f013e0a073-kube-api-access-f6bl7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wvjf\" (UID: \"7843f72c-5559-44d6-86e0-62f013e0a073\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.508312 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7843f72c-5559-44d6-86e0-62f013e0a073-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wvjf\" (UID: \"7843f72c-5559-44d6-86e0-62f013e0a073\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.513466 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7843f72c-5559-44d6-86e0-62f013e0a073-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wvjf\" (UID: 
\"7843f72c-5559-44d6-86e0-62f013e0a073\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.514630 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7843f72c-5559-44d6-86e0-62f013e0a073-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wvjf\" (UID: \"7843f72c-5559-44d6-86e0-62f013e0a073\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.526392 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6bl7\" (UniqueName: \"kubernetes.io/projected/7843f72c-5559-44d6-86e0-62f013e0a073-kube-api-access-f6bl7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5wvjf\" (UID: \"7843f72c-5559-44d6-86e0-62f013e0a073\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" Feb 19 10:15:27 crc kubenswrapper[4873]: I0219 10:15:27.554829 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" Feb 19 10:15:28 crc kubenswrapper[4873]: I0219 10:15:28.088833 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf"] Feb 19 10:15:28 crc kubenswrapper[4873]: W0219 10:15:28.096790 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7843f72c_5559_44d6_86e0_62f013e0a073.slice/crio-bcbad2c4ee95dd8d3c820ca667fce5604d8315d212f86381338d0901a7881181 WatchSource:0}: Error finding container bcbad2c4ee95dd8d3c820ca667fce5604d8315d212f86381338d0901a7881181: Status 404 returned error can't find the container with id bcbad2c4ee95dd8d3c820ca667fce5604d8315d212f86381338d0901a7881181 Feb 19 10:15:28 crc kubenswrapper[4873]: I0219 10:15:28.157776 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" event={"ID":"7843f72c-5559-44d6-86e0-62f013e0a073","Type":"ContainerStarted","Data":"bcbad2c4ee95dd8d3c820ca667fce5604d8315d212f86381338d0901a7881181"} Feb 19 10:15:30 crc kubenswrapper[4873]: I0219 10:15:30.228223 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" event={"ID":"7843f72c-5559-44d6-86e0-62f013e0a073","Type":"ContainerStarted","Data":"dfffb5690d0bc3ab31c1e460684b09eb768ee575f66111d0e554420440e9c976"} Feb 19 10:15:30 crc kubenswrapper[4873]: I0219 10:15:30.251619 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" podStartSLOduration=2.33593736 podStartE2EDuration="3.251593799s" podCreationTimestamp="2026-02-19 10:15:27 +0000 UTC" firstStartedPulling="2026-02-19 10:15:28.099509622 +0000 UTC m=+1837.388941260" lastFinishedPulling="2026-02-19 10:15:29.015166061 +0000 UTC m=+1838.304597699" observedRunningTime="2026-02-19 
10:15:30.2500409 +0000 UTC m=+1839.539472578" watchObservedRunningTime="2026-02-19 10:15:30.251593799 +0000 UTC m=+1839.541025457" Feb 19 10:15:33 crc kubenswrapper[4873]: I0219 10:15:33.055439 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-ljn4d"] Feb 19 10:15:33 crc kubenswrapper[4873]: I0219 10:15:33.063400 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-ljn4d"] Feb 19 10:15:33 crc kubenswrapper[4873]: I0219 10:15:33.497093 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="355c3bd2-5fb4-4a28-be15-e766b61eeed9" path="/var/lib/kubelet/pods/355c3bd2-5fb4-4a28-be15-e766b61eeed9/volumes" Feb 19 10:15:37 crc kubenswrapper[4873]: I0219 10:15:37.290661 4873 generic.go:334] "Generic (PLEG): container finished" podID="7843f72c-5559-44d6-86e0-62f013e0a073" containerID="dfffb5690d0bc3ab31c1e460684b09eb768ee575f66111d0e554420440e9c976" exitCode=0 Feb 19 10:15:37 crc kubenswrapper[4873]: I0219 10:15:37.290731 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" event={"ID":"7843f72c-5559-44d6-86e0-62f013e0a073","Type":"ContainerDied","Data":"dfffb5690d0bc3ab31c1e460684b09eb768ee575f66111d0e554420440e9c976"} Feb 19 10:15:38 crc kubenswrapper[4873]: I0219 10:15:38.749092 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" Feb 19 10:15:38 crc kubenswrapper[4873]: I0219 10:15:38.848851 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6bl7\" (UniqueName: \"kubernetes.io/projected/7843f72c-5559-44d6-86e0-62f013e0a073-kube-api-access-f6bl7\") pod \"7843f72c-5559-44d6-86e0-62f013e0a073\" (UID: \"7843f72c-5559-44d6-86e0-62f013e0a073\") " Feb 19 10:15:38 crc kubenswrapper[4873]: I0219 10:15:38.849006 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7843f72c-5559-44d6-86e0-62f013e0a073-inventory\") pod \"7843f72c-5559-44d6-86e0-62f013e0a073\" (UID: \"7843f72c-5559-44d6-86e0-62f013e0a073\") " Feb 19 10:15:38 crc kubenswrapper[4873]: I0219 10:15:38.849174 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7843f72c-5559-44d6-86e0-62f013e0a073-ssh-key-openstack-edpm-ipam\") pod \"7843f72c-5559-44d6-86e0-62f013e0a073\" (UID: \"7843f72c-5559-44d6-86e0-62f013e0a073\") " Feb 19 10:15:38 crc kubenswrapper[4873]: I0219 10:15:38.854782 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7843f72c-5559-44d6-86e0-62f013e0a073-kube-api-access-f6bl7" (OuterVolumeSpecName: "kube-api-access-f6bl7") pod "7843f72c-5559-44d6-86e0-62f013e0a073" (UID: "7843f72c-5559-44d6-86e0-62f013e0a073"). InnerVolumeSpecName "kube-api-access-f6bl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:15:38 crc kubenswrapper[4873]: I0219 10:15:38.878478 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7843f72c-5559-44d6-86e0-62f013e0a073-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7843f72c-5559-44d6-86e0-62f013e0a073" (UID: "7843f72c-5559-44d6-86e0-62f013e0a073"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:15:38 crc kubenswrapper[4873]: I0219 10:15:38.878836 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7843f72c-5559-44d6-86e0-62f013e0a073-inventory" (OuterVolumeSpecName: "inventory") pod "7843f72c-5559-44d6-86e0-62f013e0a073" (UID: "7843f72c-5559-44d6-86e0-62f013e0a073"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:15:38 crc kubenswrapper[4873]: I0219 10:15:38.951968 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6bl7\" (UniqueName: \"kubernetes.io/projected/7843f72c-5559-44d6-86e0-62f013e0a073-kube-api-access-f6bl7\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:38 crc kubenswrapper[4873]: I0219 10:15:38.952051 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7843f72c-5559-44d6-86e0-62f013e0a073-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:38 crc kubenswrapper[4873]: I0219 10:15:38.952080 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7843f72c-5559-44d6-86e0-62f013e0a073-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.317342 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" 
event={"ID":"7843f72c-5559-44d6-86e0-62f013e0a073","Type":"ContainerDied","Data":"bcbad2c4ee95dd8d3c820ca667fce5604d8315d212f86381338d0901a7881181"} Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.317387 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcbad2c4ee95dd8d3c820ca667fce5604d8315d212f86381338d0901a7881181" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.317449 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5wvjf" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.387857 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj"] Feb 19 10:15:39 crc kubenswrapper[4873]: E0219 10:15:39.388356 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7843f72c-5559-44d6-86e0-62f013e0a073" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.388382 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="7843f72c-5559-44d6-86e0-62f013e0a073" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.388634 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="7843f72c-5559-44d6-86e0-62f013e0a073" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.389505 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.391665 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.398346 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj"] Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.400449 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.400525 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.400665 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.462557 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/157ee933-b692-4c92-bcbd-967bc1cd377c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj\" (UID: \"157ee933-b692-4c92-bcbd-967bc1cd377c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.462646 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/157ee933-b692-4c92-bcbd-967bc1cd377c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj\" (UID: \"157ee933-b692-4c92-bcbd-967bc1cd377c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.462755 4873 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szglk\" (UniqueName: \"kubernetes.io/projected/157ee933-b692-4c92-bcbd-967bc1cd377c-kube-api-access-szglk\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj\" (UID: \"157ee933-b692-4c92-bcbd-967bc1cd377c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.564348 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szglk\" (UniqueName: \"kubernetes.io/projected/157ee933-b692-4c92-bcbd-967bc1cd377c-kube-api-access-szglk\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj\" (UID: \"157ee933-b692-4c92-bcbd-967bc1cd377c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.564541 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/157ee933-b692-4c92-bcbd-967bc1cd377c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj\" (UID: \"157ee933-b692-4c92-bcbd-967bc1cd377c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.564593 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/157ee933-b692-4c92-bcbd-967bc1cd377c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj\" (UID: \"157ee933-b692-4c92-bcbd-967bc1cd377c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.571876 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/157ee933-b692-4c92-bcbd-967bc1cd377c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj\" (UID: \"157ee933-b692-4c92-bcbd-967bc1cd377c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.589440 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szglk\" (UniqueName: \"kubernetes.io/projected/157ee933-b692-4c92-bcbd-967bc1cd377c-kube-api-access-szglk\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj\" (UID: \"157ee933-b692-4c92-bcbd-967bc1cd377c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.590511 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/157ee933-b692-4c92-bcbd-967bc1cd377c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj\" (UID: \"157ee933-b692-4c92-bcbd-967bc1cd377c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" Feb 19 10:15:39 crc kubenswrapper[4873]: I0219 10:15:39.707572 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" Feb 19 10:15:40 crc kubenswrapper[4873]: I0219 10:15:40.240512 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj"] Feb 19 10:15:40 crc kubenswrapper[4873]: W0219 10:15:40.245777 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod157ee933_b692_4c92_bcbd_967bc1cd377c.slice/crio-8cfabebb1577b55754b14ab082be646f7ff012789f460922cc89eb4f6a067296 WatchSource:0}: Error finding container 8cfabebb1577b55754b14ab082be646f7ff012789f460922cc89eb4f6a067296: Status 404 returned error can't find the container with id 8cfabebb1577b55754b14ab082be646f7ff012789f460922cc89eb4f6a067296 Feb 19 10:15:40 crc kubenswrapper[4873]: I0219 10:15:40.250044 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 10:15:40 crc kubenswrapper[4873]: I0219 10:15:40.328395 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" event={"ID":"157ee933-b692-4c92-bcbd-967bc1cd377c","Type":"ContainerStarted","Data":"8cfabebb1577b55754b14ab082be646f7ff012789f460922cc89eb4f6a067296"} Feb 19 10:15:41 crc kubenswrapper[4873]: I0219 10:15:41.338900 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" event={"ID":"157ee933-b692-4c92-bcbd-967bc1cd377c","Type":"ContainerStarted","Data":"82218227e4b04484e348e2f28e8d3f15ccea294e26f028723d97bf71c024e437"} Feb 19 10:15:41 crc kubenswrapper[4873]: I0219 10:15:41.361149 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" podStartSLOduration=1.9329847020000002 podStartE2EDuration="2.361131233s" podCreationTimestamp="2026-02-19 10:15:39 +0000 UTC" 
firstStartedPulling="2026-02-19 10:15:40.249752595 +0000 UTC m=+1849.539184233" lastFinishedPulling="2026-02-19 10:15:40.677899086 +0000 UTC m=+1849.967330764" observedRunningTime="2026-02-19 10:15:41.35499298 +0000 UTC m=+1850.644424618" watchObservedRunningTime="2026-02-19 10:15:41.361131233 +0000 UTC m=+1850.650562881" Feb 19 10:15:50 crc kubenswrapper[4873]: I0219 10:15:50.430390 4873 generic.go:334] "Generic (PLEG): container finished" podID="157ee933-b692-4c92-bcbd-967bc1cd377c" containerID="82218227e4b04484e348e2f28e8d3f15ccea294e26f028723d97bf71c024e437" exitCode=0 Feb 19 10:15:50 crc kubenswrapper[4873]: I0219 10:15:50.430492 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" event={"ID":"157ee933-b692-4c92-bcbd-967bc1cd377c","Type":"ContainerDied","Data":"82218227e4b04484e348e2f28e8d3f15ccea294e26f028723d97bf71c024e437"} Feb 19 10:15:51 crc kubenswrapper[4873]: I0219 10:15:51.893386 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.023006 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/157ee933-b692-4c92-bcbd-967bc1cd377c-inventory\") pod \"157ee933-b692-4c92-bcbd-967bc1cd377c\" (UID: \"157ee933-b692-4c92-bcbd-967bc1cd377c\") " Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.023138 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/157ee933-b692-4c92-bcbd-967bc1cd377c-ssh-key-openstack-edpm-ipam\") pod \"157ee933-b692-4c92-bcbd-967bc1cd377c\" (UID: \"157ee933-b692-4c92-bcbd-967bc1cd377c\") " Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.023177 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szglk\" (UniqueName: \"kubernetes.io/projected/157ee933-b692-4c92-bcbd-967bc1cd377c-kube-api-access-szglk\") pod \"157ee933-b692-4c92-bcbd-967bc1cd377c\" (UID: \"157ee933-b692-4c92-bcbd-967bc1cd377c\") " Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.028727 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/157ee933-b692-4c92-bcbd-967bc1cd377c-kube-api-access-szglk" (OuterVolumeSpecName: "kube-api-access-szglk") pod "157ee933-b692-4c92-bcbd-967bc1cd377c" (UID: "157ee933-b692-4c92-bcbd-967bc1cd377c"). InnerVolumeSpecName "kube-api-access-szglk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.055886 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/157ee933-b692-4c92-bcbd-967bc1cd377c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "157ee933-b692-4c92-bcbd-967bc1cd377c" (UID: "157ee933-b692-4c92-bcbd-967bc1cd377c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.058901 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/157ee933-b692-4c92-bcbd-967bc1cd377c-inventory" (OuterVolumeSpecName: "inventory") pod "157ee933-b692-4c92-bcbd-967bc1cd377c" (UID: "157ee933-b692-4c92-bcbd-967bc1cd377c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.127019 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/157ee933-b692-4c92-bcbd-967bc1cd377c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.127098 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szglk\" (UniqueName: \"kubernetes.io/projected/157ee933-b692-4c92-bcbd-967bc1cd377c-kube-api-access-szglk\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.127144 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/157ee933-b692-4c92-bcbd-967bc1cd377c-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.453596 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" 
event={"ID":"157ee933-b692-4c92-bcbd-967bc1cd377c","Type":"ContainerDied","Data":"8cfabebb1577b55754b14ab082be646f7ff012789f460922cc89eb4f6a067296"} Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.453651 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cfabebb1577b55754b14ab082be646f7ff012789f460922cc89eb4f6a067296" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.453708 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.534422 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6"] Feb 19 10:15:52 crc kubenswrapper[4873]: E0219 10:15:52.535174 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157ee933-b692-4c92-bcbd-967bc1cd377c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.535281 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="157ee933-b692-4c92-bcbd-967bc1cd377c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.535574 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="157ee933-b692-4c92-bcbd-967bc1cd377c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.537724 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.543750 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.543996 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.544227 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.544632 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.545547 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.545852 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.545933 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.546048 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.560405 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6"] Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.637186 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.637249 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.637273 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.637396 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.637661 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.637719 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.637804 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.637867 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.638095 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.638154 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.638225 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.638279 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.638369 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57nr4\" (UniqueName: 
\"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-kube-api-access-57nr4\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.638424 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.740775 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.740833 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.740861 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.740887 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.740919 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.740945 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.741482 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.741511 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.741540 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57nr4\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-kube-api-access-57nr4\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.741576 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.741658 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 
10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.741687 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.741715 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.741752 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.744824 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.744913 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.745702 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.746127 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.747547 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.747573 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.748224 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.748504 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.748541 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.749265 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-ssh-key-openstack-edpm-ipam\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.749750 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.751189 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.756473 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.758894 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57nr4\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-kube-api-access-57nr4\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:52 crc kubenswrapper[4873]: I0219 10:15:52.862997 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:15:53 crc kubenswrapper[4873]: W0219 10:15:53.386542 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod537c2ac8_0912_4609_ab4e_760060a78d52.slice/crio-19b267f7d9414ea523f984cf39b74596cada12d50893b7c0d4c30582756f8797 WatchSource:0}: Error finding container 19b267f7d9414ea523f984cf39b74596cada12d50893b7c0d4c30582756f8797: Status 404 returned error can't find the container with id 19b267f7d9414ea523f984cf39b74596cada12d50893b7c0d4c30582756f8797 Feb 19 10:15:53 crc kubenswrapper[4873]: I0219 10:15:53.388169 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6"] Feb 19 10:15:53 crc kubenswrapper[4873]: I0219 10:15:53.461818 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" event={"ID":"537c2ac8-0912-4609-ab4e-760060a78d52","Type":"ContainerStarted","Data":"19b267f7d9414ea523f984cf39b74596cada12d50893b7c0d4c30582756f8797"} Feb 19 10:15:54 crc kubenswrapper[4873]: I0219 10:15:54.472442 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" event={"ID":"537c2ac8-0912-4609-ab4e-760060a78d52","Type":"ContainerStarted","Data":"663497054b60d32c21bffa4e04903d9e3273292d4a9fb164931190510b2cc955"} Feb 19 10:15:54 crc kubenswrapper[4873]: I0219 10:15:54.496965 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" podStartSLOduration=2.068288588 podStartE2EDuration="2.496931261s" 
podCreationTimestamp="2026-02-19 10:15:52 +0000 UTC" firstStartedPulling="2026-02-19 10:15:53.388799175 +0000 UTC m=+1862.678230813" lastFinishedPulling="2026-02-19 10:15:53.817441838 +0000 UTC m=+1863.106873486" observedRunningTime="2026-02-19 10:15:54.494204603 +0000 UTC m=+1863.783636261" watchObservedRunningTime="2026-02-19 10:15:54.496931261 +0000 UTC m=+1863.786362889" Feb 19 10:15:58 crc kubenswrapper[4873]: I0219 10:15:58.700896 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tzxhp"] Feb 19 10:15:58 crc kubenswrapper[4873]: I0219 10:15:58.704615 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:15:58 crc kubenswrapper[4873]: I0219 10:15:58.713407 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzxhp"] Feb 19 10:15:58 crc kubenswrapper[4873]: I0219 10:15:58.763756 4873 scope.go:117] "RemoveContainer" containerID="087819f431db46b81166e897b21b99194aa2c81f307651c133879685fcb5a03d" Feb 19 10:15:58 crc kubenswrapper[4873]: I0219 10:15:58.765227 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3655910-cb6e-4b54-bd68-48c5ba1551df-catalog-content\") pod \"certified-operators-tzxhp\" (UID: \"d3655910-cb6e-4b54-bd68-48c5ba1551df\") " pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:15:58 crc kubenswrapper[4873]: I0219 10:15:58.765287 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmr67\" (UniqueName: \"kubernetes.io/projected/d3655910-cb6e-4b54-bd68-48c5ba1551df-kube-api-access-xmr67\") pod \"certified-operators-tzxhp\" (UID: \"d3655910-cb6e-4b54-bd68-48c5ba1551df\") " pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:15:58 crc kubenswrapper[4873]: I0219 10:15:58.765387 
4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3655910-cb6e-4b54-bd68-48c5ba1551df-utilities\") pod \"certified-operators-tzxhp\" (UID: \"d3655910-cb6e-4b54-bd68-48c5ba1551df\") " pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:15:58 crc kubenswrapper[4873]: I0219 10:15:58.868381 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3655910-cb6e-4b54-bd68-48c5ba1551df-catalog-content\") pod \"certified-operators-tzxhp\" (UID: \"d3655910-cb6e-4b54-bd68-48c5ba1551df\") " pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:15:58 crc kubenswrapper[4873]: I0219 10:15:58.868831 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmr67\" (UniqueName: \"kubernetes.io/projected/d3655910-cb6e-4b54-bd68-48c5ba1551df-kube-api-access-xmr67\") pod \"certified-operators-tzxhp\" (UID: \"d3655910-cb6e-4b54-bd68-48c5ba1551df\") " pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:15:58 crc kubenswrapper[4873]: I0219 10:15:58.868866 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3655910-cb6e-4b54-bd68-48c5ba1551df-catalog-content\") pod \"certified-operators-tzxhp\" (UID: \"d3655910-cb6e-4b54-bd68-48c5ba1551df\") " pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:15:58 crc kubenswrapper[4873]: I0219 10:15:58.868890 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3655910-cb6e-4b54-bd68-48c5ba1551df-utilities\") pod \"certified-operators-tzxhp\" (UID: \"d3655910-cb6e-4b54-bd68-48c5ba1551df\") " pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:15:58 crc kubenswrapper[4873]: I0219 10:15:58.869331 4873 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3655910-cb6e-4b54-bd68-48c5ba1551df-utilities\") pod \"certified-operators-tzxhp\" (UID: \"d3655910-cb6e-4b54-bd68-48c5ba1551df\") " pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:15:58 crc kubenswrapper[4873]: I0219 10:15:58.906626 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmr67\" (UniqueName: \"kubernetes.io/projected/d3655910-cb6e-4b54-bd68-48c5ba1551df-kube-api-access-xmr67\") pod \"certified-operators-tzxhp\" (UID: \"d3655910-cb6e-4b54-bd68-48c5ba1551df\") " pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:15:59 crc kubenswrapper[4873]: I0219 10:15:59.038808 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:15:59 crc kubenswrapper[4873]: I0219 10:15:59.583862 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzxhp"] Feb 19 10:16:00 crc kubenswrapper[4873]: I0219 10:16:00.536785 4873 generic.go:334] "Generic (PLEG): container finished" podID="d3655910-cb6e-4b54-bd68-48c5ba1551df" containerID="45c12349a9b65569282146a1d281f342872959e6cda848dedf88f9d430fd517b" exitCode=0 Feb 19 10:16:00 crc kubenswrapper[4873]: I0219 10:16:00.536888 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxhp" event={"ID":"d3655910-cb6e-4b54-bd68-48c5ba1551df","Type":"ContainerDied","Data":"45c12349a9b65569282146a1d281f342872959e6cda848dedf88f9d430fd517b"} Feb 19 10:16:00 crc kubenswrapper[4873]: I0219 10:16:00.537470 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxhp" event={"ID":"d3655910-cb6e-4b54-bd68-48c5ba1551df","Type":"ContainerStarted","Data":"d9e5dcde93903f3ae386da846f1499da1d8e5d8069dd46542cd55299f2b61baa"} Feb 19 10:16:01 crc 
kubenswrapper[4873]: I0219 10:16:01.547790 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxhp" event={"ID":"d3655910-cb6e-4b54-bd68-48c5ba1551df","Type":"ContainerStarted","Data":"06ee0c5b7947af8e68f40c163a4f6ddeac23664eb557936ef15dc5855e1148f2"} Feb 19 10:16:02 crc kubenswrapper[4873]: I0219 10:16:02.559594 4873 generic.go:334] "Generic (PLEG): container finished" podID="d3655910-cb6e-4b54-bd68-48c5ba1551df" containerID="06ee0c5b7947af8e68f40c163a4f6ddeac23664eb557936ef15dc5855e1148f2" exitCode=0 Feb 19 10:16:02 crc kubenswrapper[4873]: I0219 10:16:02.559650 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxhp" event={"ID":"d3655910-cb6e-4b54-bd68-48c5ba1551df","Type":"ContainerDied","Data":"06ee0c5b7947af8e68f40c163a4f6ddeac23664eb557936ef15dc5855e1148f2"} Feb 19 10:16:03 crc kubenswrapper[4873]: I0219 10:16:03.569727 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxhp" event={"ID":"d3655910-cb6e-4b54-bd68-48c5ba1551df","Type":"ContainerStarted","Data":"e67e42fe1b67208dfa37d54ccd67ab6932304c96229032a8dd4a038c11fc7b35"} Feb 19 10:16:03 crc kubenswrapper[4873]: I0219 10:16:03.594816 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tzxhp" podStartSLOduration=3.171231069 podStartE2EDuration="5.594800029s" podCreationTimestamp="2026-02-19 10:15:58 +0000 UTC" firstStartedPulling="2026-02-19 10:16:00.539410611 +0000 UTC m=+1869.828842249" lastFinishedPulling="2026-02-19 10:16:02.962979571 +0000 UTC m=+1872.252411209" observedRunningTime="2026-02-19 10:16:03.592066741 +0000 UTC m=+1872.881498379" watchObservedRunningTime="2026-02-19 10:16:03.594800029 +0000 UTC m=+1872.884231667" Feb 19 10:16:09 crc kubenswrapper[4873]: I0219 10:16:09.039428 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:16:09 crc kubenswrapper[4873]: I0219 10:16:09.040070 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:16:09 crc kubenswrapper[4873]: I0219 10:16:09.094503 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:16:09 crc kubenswrapper[4873]: I0219 10:16:09.666481 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:16:10 crc kubenswrapper[4873]: I0219 10:16:10.291038 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tzxhp"] Feb 19 10:16:11 crc kubenswrapper[4873]: I0219 10:16:11.639248 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tzxhp" podUID="d3655910-cb6e-4b54-bd68-48c5ba1551df" containerName="registry-server" containerID="cri-o://e67e42fe1b67208dfa37d54ccd67ab6932304c96229032a8dd4a038c11fc7b35" gracePeriod=2 Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.116671 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.150885 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3655910-cb6e-4b54-bd68-48c5ba1551df-utilities\") pod \"d3655910-cb6e-4b54-bd68-48c5ba1551df\" (UID: \"d3655910-cb6e-4b54-bd68-48c5ba1551df\") " Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.151020 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3655910-cb6e-4b54-bd68-48c5ba1551df-catalog-content\") pod \"d3655910-cb6e-4b54-bd68-48c5ba1551df\" (UID: \"d3655910-cb6e-4b54-bd68-48c5ba1551df\") " Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.151267 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmr67\" (UniqueName: \"kubernetes.io/projected/d3655910-cb6e-4b54-bd68-48c5ba1551df-kube-api-access-xmr67\") pod \"d3655910-cb6e-4b54-bd68-48c5ba1551df\" (UID: \"d3655910-cb6e-4b54-bd68-48c5ba1551df\") " Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.151924 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3655910-cb6e-4b54-bd68-48c5ba1551df-utilities" (OuterVolumeSpecName: "utilities") pod "d3655910-cb6e-4b54-bd68-48c5ba1551df" (UID: "d3655910-cb6e-4b54-bd68-48c5ba1551df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.162417 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3655910-cb6e-4b54-bd68-48c5ba1551df-kube-api-access-xmr67" (OuterVolumeSpecName: "kube-api-access-xmr67") pod "d3655910-cb6e-4b54-bd68-48c5ba1551df" (UID: "d3655910-cb6e-4b54-bd68-48c5ba1551df"). InnerVolumeSpecName "kube-api-access-xmr67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.253509 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3655910-cb6e-4b54-bd68-48c5ba1551df-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.253546 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmr67\" (UniqueName: \"kubernetes.io/projected/d3655910-cb6e-4b54-bd68-48c5ba1551df-kube-api-access-xmr67\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.651482 4873 generic.go:334] "Generic (PLEG): container finished" podID="d3655910-cb6e-4b54-bd68-48c5ba1551df" containerID="e67e42fe1b67208dfa37d54ccd67ab6932304c96229032a8dd4a038c11fc7b35" exitCode=0 Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.651530 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxhp" event={"ID":"d3655910-cb6e-4b54-bd68-48c5ba1551df","Type":"ContainerDied","Data":"e67e42fe1b67208dfa37d54ccd67ab6932304c96229032a8dd4a038c11fc7b35"} Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.651555 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tzxhp" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.651576 4873 scope.go:117] "RemoveContainer" containerID="e67e42fe1b67208dfa37d54ccd67ab6932304c96229032a8dd4a038c11fc7b35" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.651561 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzxhp" event={"ID":"d3655910-cb6e-4b54-bd68-48c5ba1551df","Type":"ContainerDied","Data":"d9e5dcde93903f3ae386da846f1499da1d8e5d8069dd46542cd55299f2b61baa"} Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.677119 4873 scope.go:117] "RemoveContainer" containerID="06ee0c5b7947af8e68f40c163a4f6ddeac23664eb557936ef15dc5855e1148f2" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.701865 4873 scope.go:117] "RemoveContainer" containerID="45c12349a9b65569282146a1d281f342872959e6cda848dedf88f9d430fd517b" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.743464 4873 scope.go:117] "RemoveContainer" containerID="e67e42fe1b67208dfa37d54ccd67ab6932304c96229032a8dd4a038c11fc7b35" Feb 19 10:16:12 crc kubenswrapper[4873]: E0219 10:16:12.743800 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e67e42fe1b67208dfa37d54ccd67ab6932304c96229032a8dd4a038c11fc7b35\": container with ID starting with e67e42fe1b67208dfa37d54ccd67ab6932304c96229032a8dd4a038c11fc7b35 not found: ID does not exist" containerID="e67e42fe1b67208dfa37d54ccd67ab6932304c96229032a8dd4a038c11fc7b35" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.743830 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e67e42fe1b67208dfa37d54ccd67ab6932304c96229032a8dd4a038c11fc7b35"} err="failed to get container status \"e67e42fe1b67208dfa37d54ccd67ab6932304c96229032a8dd4a038c11fc7b35\": rpc error: code = NotFound desc = could not find container 
\"e67e42fe1b67208dfa37d54ccd67ab6932304c96229032a8dd4a038c11fc7b35\": container with ID starting with e67e42fe1b67208dfa37d54ccd67ab6932304c96229032a8dd4a038c11fc7b35 not found: ID does not exist" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.743852 4873 scope.go:117] "RemoveContainer" containerID="06ee0c5b7947af8e68f40c163a4f6ddeac23664eb557936ef15dc5855e1148f2" Feb 19 10:16:12 crc kubenswrapper[4873]: E0219 10:16:12.744062 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ee0c5b7947af8e68f40c163a4f6ddeac23664eb557936ef15dc5855e1148f2\": container with ID starting with 06ee0c5b7947af8e68f40c163a4f6ddeac23664eb557936ef15dc5855e1148f2 not found: ID does not exist" containerID="06ee0c5b7947af8e68f40c163a4f6ddeac23664eb557936ef15dc5855e1148f2" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.744114 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ee0c5b7947af8e68f40c163a4f6ddeac23664eb557936ef15dc5855e1148f2"} err="failed to get container status \"06ee0c5b7947af8e68f40c163a4f6ddeac23664eb557936ef15dc5855e1148f2\": rpc error: code = NotFound desc = could not find container \"06ee0c5b7947af8e68f40c163a4f6ddeac23664eb557936ef15dc5855e1148f2\": container with ID starting with 06ee0c5b7947af8e68f40c163a4f6ddeac23664eb557936ef15dc5855e1148f2 not found: ID does not exist" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.744136 4873 scope.go:117] "RemoveContainer" containerID="45c12349a9b65569282146a1d281f342872959e6cda848dedf88f9d430fd517b" Feb 19 10:16:12 crc kubenswrapper[4873]: E0219 10:16:12.744486 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45c12349a9b65569282146a1d281f342872959e6cda848dedf88f9d430fd517b\": container with ID starting with 45c12349a9b65569282146a1d281f342872959e6cda848dedf88f9d430fd517b not found: ID does not exist" 
containerID="45c12349a9b65569282146a1d281f342872959e6cda848dedf88f9d430fd517b" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.744545 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45c12349a9b65569282146a1d281f342872959e6cda848dedf88f9d430fd517b"} err="failed to get container status \"45c12349a9b65569282146a1d281f342872959e6cda848dedf88f9d430fd517b\": rpc error: code = NotFound desc = could not find container \"45c12349a9b65569282146a1d281f342872959e6cda848dedf88f9d430fd517b\": container with ID starting with 45c12349a9b65569282146a1d281f342872959e6cda848dedf88f9d430fd517b not found: ID does not exist" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.812596 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3655910-cb6e-4b54-bd68-48c5ba1551df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3655910-cb6e-4b54-bd68-48c5ba1551df" (UID: "d3655910-cb6e-4b54-bd68-48c5ba1551df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.864149 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3655910-cb6e-4b54-bd68-48c5ba1551df-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.983944 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tzxhp"] Feb 19 10:16:12 crc kubenswrapper[4873]: I0219 10:16:12.992870 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tzxhp"] Feb 19 10:16:13 crc kubenswrapper[4873]: I0219 10:16:13.506914 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3655910-cb6e-4b54-bd68-48c5ba1551df" path="/var/lib/kubelet/pods/d3655910-cb6e-4b54-bd68-48c5ba1551df/volumes" Feb 19 10:16:31 crc kubenswrapper[4873]: I0219 10:16:31.833613 4873 generic.go:334] "Generic (PLEG): container finished" podID="537c2ac8-0912-4609-ab4e-760060a78d52" containerID="663497054b60d32c21bffa4e04903d9e3273292d4a9fb164931190510b2cc955" exitCode=0 Feb 19 10:16:31 crc kubenswrapper[4873]: I0219 10:16:31.833708 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" event={"ID":"537c2ac8-0912-4609-ab4e-760060a78d52","Type":"ContainerDied","Data":"663497054b60d32c21bffa4e04903d9e3273292d4a9fb164931190510b2cc955"} Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.287089 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.451287 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-nova-combined-ca-bundle\") pod \"537c2ac8-0912-4609-ab4e-760060a78d52\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.451383 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-libvirt-combined-ca-bundle\") pod \"537c2ac8-0912-4609-ab4e-760060a78d52\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.451446 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"537c2ac8-0912-4609-ab4e-760060a78d52\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.451478 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-inventory\") pod \"537c2ac8-0912-4609-ab4e-760060a78d52\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.451506 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-repo-setup-combined-ca-bundle\") pod \"537c2ac8-0912-4609-ab4e-760060a78d52\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " Feb 19 10:16:33 crc 
kubenswrapper[4873]: I0219 10:16:33.451540 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"537c2ac8-0912-4609-ab4e-760060a78d52\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.452173 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"537c2ac8-0912-4609-ab4e-760060a78d52\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.452255 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-neutron-metadata-combined-ca-bundle\") pod \"537c2ac8-0912-4609-ab4e-760060a78d52\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.452303 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-ovn-combined-ca-bundle\") pod \"537c2ac8-0912-4609-ab4e-760060a78d52\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.452348 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-ssh-key-openstack-edpm-ipam\") pod \"537c2ac8-0912-4609-ab4e-760060a78d52\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " Feb 19 10:16:33 crc 
kubenswrapper[4873]: I0219 10:16:33.452391 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-telemetry-combined-ca-bundle\") pod \"537c2ac8-0912-4609-ab4e-760060a78d52\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.452453 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57nr4\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-kube-api-access-57nr4\") pod \"537c2ac8-0912-4609-ab4e-760060a78d52\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.452499 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-bootstrap-combined-ca-bundle\") pod \"537c2ac8-0912-4609-ab4e-760060a78d52\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.452526 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-ovn-default-certs-0\") pod \"537c2ac8-0912-4609-ab4e-760060a78d52\" (UID: \"537c2ac8-0912-4609-ab4e-760060a78d52\") " Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.459191 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "537c2ac8-0912-4609-ab4e-760060a78d52" (UID: "537c2ac8-0912-4609-ab4e-760060a78d52"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.459682 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "537c2ac8-0912-4609-ab4e-760060a78d52" (UID: "537c2ac8-0912-4609-ab4e-760060a78d52"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.460153 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "537c2ac8-0912-4609-ab4e-760060a78d52" (UID: "537c2ac8-0912-4609-ab4e-760060a78d52"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.460197 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "537c2ac8-0912-4609-ab4e-760060a78d52" (UID: "537c2ac8-0912-4609-ab4e-760060a78d52"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.462299 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "537c2ac8-0912-4609-ab4e-760060a78d52" (UID: "537c2ac8-0912-4609-ab4e-760060a78d52"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.463422 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "537c2ac8-0912-4609-ab4e-760060a78d52" (UID: "537c2ac8-0912-4609-ab4e-760060a78d52"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.464835 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "537c2ac8-0912-4609-ab4e-760060a78d52" (UID: "537c2ac8-0912-4609-ab4e-760060a78d52"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.465044 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "537c2ac8-0912-4609-ab4e-760060a78d52" (UID: "537c2ac8-0912-4609-ab4e-760060a78d52"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.466913 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "537c2ac8-0912-4609-ab4e-760060a78d52" (UID: "537c2ac8-0912-4609-ab4e-760060a78d52"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.468004 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-kube-api-access-57nr4" (OuterVolumeSpecName: "kube-api-access-57nr4") pod "537c2ac8-0912-4609-ab4e-760060a78d52" (UID: "537c2ac8-0912-4609-ab4e-760060a78d52"). InnerVolumeSpecName "kube-api-access-57nr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.468216 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "537c2ac8-0912-4609-ab4e-760060a78d52" (UID: "537c2ac8-0912-4609-ab4e-760060a78d52"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.476982 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "537c2ac8-0912-4609-ab4e-760060a78d52" (UID: "537c2ac8-0912-4609-ab4e-760060a78d52"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.486928 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "537c2ac8-0912-4609-ab4e-760060a78d52" (UID: "537c2ac8-0912-4609-ab4e-760060a78d52"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.491230 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-inventory" (OuterVolumeSpecName: "inventory") pod "537c2ac8-0912-4609-ab4e-760060a78d52" (UID: "537c2ac8-0912-4609-ab4e-760060a78d52"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.554446 4873 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.554477 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57nr4\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-kube-api-access-57nr4\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.554508 4873 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.554519 4873 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.554528 4873 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.554536 4873 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.554544 4873 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.554555 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.554563 4873 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.554573 4873 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.554582 4873 
reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/537c2ac8-0912-4609-ab4e-760060a78d52-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.554591 4873 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.554600 4873 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.554607 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/537c2ac8-0912-4609-ab4e-760060a78d52-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.850760 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" event={"ID":"537c2ac8-0912-4609-ab4e-760060a78d52","Type":"ContainerDied","Data":"19b267f7d9414ea523f984cf39b74596cada12d50893b7c0d4c30582756f8797"} Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.850824 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b267f7d9414ea523f984cf39b74596cada12d50893b7c0d4c30582756f8797" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.850829 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.985814 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c"] Feb 19 10:16:33 crc kubenswrapper[4873]: E0219 10:16:33.986214 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537c2ac8-0912-4609-ab4e-760060a78d52" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.986227 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="537c2ac8-0912-4609-ab4e-760060a78d52" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 10:16:33 crc kubenswrapper[4873]: E0219 10:16:33.986241 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3655910-cb6e-4b54-bd68-48c5ba1551df" containerName="registry-server" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.986248 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3655910-cb6e-4b54-bd68-48c5ba1551df" containerName="registry-server" Feb 19 10:16:33 crc kubenswrapper[4873]: E0219 10:16:33.986275 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3655910-cb6e-4b54-bd68-48c5ba1551df" containerName="extract-content" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.986280 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3655910-cb6e-4b54-bd68-48c5ba1551df" containerName="extract-content" Feb 19 10:16:33 crc kubenswrapper[4873]: E0219 10:16:33.986295 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3655910-cb6e-4b54-bd68-48c5ba1551df" containerName="extract-utilities" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.986300 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3655910-cb6e-4b54-bd68-48c5ba1551df" containerName="extract-utilities" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.986477 
4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="537c2ac8-0912-4609-ab4e-760060a78d52" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.986486 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3655910-cb6e-4b54-bd68-48c5ba1551df" containerName="registry-server" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.987091 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.994903 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.996298 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.996461 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.996636 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5" Feb 19 10:16:33 crc kubenswrapper[4873]: I0219 10:16:33.998947 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.020279 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c"] Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.067358 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: 
\"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.067436 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbt8b\" (UniqueName: \"kubernetes.io/projected/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-kube-api-access-tbt8b\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.067546 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.067770 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.067929 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 
10:16:34.170022 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.170090 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.170180 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.170210 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbt8b\" (UniqueName: \"kubernetes.io/projected/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-kube-api-access-tbt8b\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.170256 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ovn-combined-ca-bundle\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.171308 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.175035 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.175173 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.177463 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.187801 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tbt8b\" (UniqueName: \"kubernetes.io/projected/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-kube-api-access-tbt8b\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dks5c\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.354589 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:16:34 crc kubenswrapper[4873]: I0219 10:16:34.906299 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c"] Feb 19 10:16:35 crc kubenswrapper[4873]: I0219 10:16:35.871169 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" event={"ID":"f5d576b5-56dd-4f9f-b67b-0ee87213ea78","Type":"ContainerStarted","Data":"c7e60ef35ab043c3ac745ac0132cf5af5a966586f74c638682238d470e1f2abe"} Feb 19 10:16:35 crc kubenswrapper[4873]: I0219 10:16:35.871223 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" event={"ID":"f5d576b5-56dd-4f9f-b67b-0ee87213ea78","Type":"ContainerStarted","Data":"bd2f1c663d883cba02fe468ae90a655045cbc334efcc127e3604a7513093b1d3"} Feb 19 10:16:35 crc kubenswrapper[4873]: I0219 10:16:35.899114 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" podStartSLOduration=2.459322563 podStartE2EDuration="2.899084869s" podCreationTimestamp="2026-02-19 10:16:33 +0000 UTC" firstStartedPulling="2026-02-19 10:16:34.910580113 +0000 UTC m=+1904.200011751" lastFinishedPulling="2026-02-19 10:16:35.350342419 +0000 UTC m=+1904.639774057" observedRunningTime="2026-02-19 10:16:35.891434657 +0000 UTC m=+1905.180866295" watchObservedRunningTime="2026-02-19 10:16:35.899084869 +0000 UTC m=+1905.188516497" Feb 19 10:17:45 
crc kubenswrapper[4873]: I0219 10:17:45.513616 4873 generic.go:334] "Generic (PLEG): container finished" podID="f5d576b5-56dd-4f9f-b67b-0ee87213ea78" containerID="c7e60ef35ab043c3ac745ac0132cf5af5a966586f74c638682238d470e1f2abe" exitCode=0 Feb 19 10:17:45 crc kubenswrapper[4873]: I0219 10:17:45.513922 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" event={"ID":"f5d576b5-56dd-4f9f-b67b-0ee87213ea78","Type":"ContainerDied","Data":"c7e60ef35ab043c3ac745ac0132cf5af5a966586f74c638682238d470e1f2abe"} Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.009378 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.189375 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ssh-key-openstack-edpm-ipam\") pod \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.189500 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbt8b\" (UniqueName: \"kubernetes.io/projected/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-kube-api-access-tbt8b\") pod \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.189586 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ovncontroller-config-0\") pod \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.189620 4873 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-inventory\") pod \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.189717 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ovn-combined-ca-bundle\") pod \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\" (UID: \"f5d576b5-56dd-4f9f-b67b-0ee87213ea78\") " Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.196370 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f5d576b5-56dd-4f9f-b67b-0ee87213ea78" (UID: "f5d576b5-56dd-4f9f-b67b-0ee87213ea78"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.196448 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-kube-api-access-tbt8b" (OuterVolumeSpecName: "kube-api-access-tbt8b") pod "f5d576b5-56dd-4f9f-b67b-0ee87213ea78" (UID: "f5d576b5-56dd-4f9f-b67b-0ee87213ea78"). InnerVolumeSpecName "kube-api-access-tbt8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.221689 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f5d576b5-56dd-4f9f-b67b-0ee87213ea78" (UID: "f5d576b5-56dd-4f9f-b67b-0ee87213ea78"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.222084 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-inventory" (OuterVolumeSpecName: "inventory") pod "f5d576b5-56dd-4f9f-b67b-0ee87213ea78" (UID: "f5d576b5-56dd-4f9f-b67b-0ee87213ea78"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.222328 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "f5d576b5-56dd-4f9f-b67b-0ee87213ea78" (UID: "f5d576b5-56dd-4f9f-b67b-0ee87213ea78"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.292400 4873 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.292431 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.292441 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbt8b\" (UniqueName: \"kubernetes.io/projected/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-kube-api-access-tbt8b\") on node \"crc\" DevicePath \"\"" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.292452 4873 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.292463 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5d576b5-56dd-4f9f-b67b-0ee87213ea78-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.530623 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" event={"ID":"f5d576b5-56dd-4f9f-b67b-0ee87213ea78","Type":"ContainerDied","Data":"bd2f1c663d883cba02fe468ae90a655045cbc334efcc127e3604a7513093b1d3"} Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.530657 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dks5c" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.530671 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd2f1c663d883cba02fe468ae90a655045cbc334efcc127e3604a7513093b1d3" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.627091 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb"] Feb 19 10:17:47 crc kubenswrapper[4873]: E0219 10:17:47.627616 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d576b5-56dd-4f9f-b67b-0ee87213ea78" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.627639 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d576b5-56dd-4f9f-b67b-0ee87213ea78" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.627862 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5d576b5-56dd-4f9f-b67b-0ee87213ea78" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 
19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.628658 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.631120 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.631346 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.631401 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.631479 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.633086 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.635436 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.660945 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb"] Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.700881 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.700952 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.701029 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl925\" (UniqueName: \"kubernetes.io/projected/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-kube-api-access-rl925\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.701149 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.701245 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" 
Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.701295 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.802074 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.802151 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.802182 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl925\" (UniqueName: \"kubernetes.io/projected/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-kube-api-access-rl925\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 
10:17:47.802223 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.802300 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.802332 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.808467 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.809416 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.810360 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.811240 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.820923 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.822139 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl925\" (UniqueName: \"kubernetes.io/projected/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-kube-api-access-rl925\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:47 crc kubenswrapper[4873]: I0219 10:17:47.957522 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:17:48 crc kubenswrapper[4873]: I0219 10:17:48.240297 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:17:48 crc kubenswrapper[4873]: I0219 10:17:48.240349 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:17:48 crc kubenswrapper[4873]: I0219 10:17:48.478184 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb"] Feb 19 10:17:48 crc kubenswrapper[4873]: I0219 10:17:48.541462 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" event={"ID":"a607f592-ebca-4bf5-9e98-f9e2bc131ff1","Type":"ContainerStarted","Data":"98d6db3ecd221a07d5e3322329f070324703abb980ca281c65ac99b2ade4cb54"} Feb 19 10:17:50 crc kubenswrapper[4873]: I0219 10:17:50.560770 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" 
event={"ID":"a607f592-ebca-4bf5-9e98-f9e2bc131ff1","Type":"ContainerStarted","Data":"2f4be555673dd1a1b739a8ef2e49bb827fa57d32386544997526fd8f60519a74"} Feb 19 10:17:50 crc kubenswrapper[4873]: I0219 10:17:50.580749 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" podStartSLOduration=2.565319735 podStartE2EDuration="3.580727648s" podCreationTimestamp="2026-02-19 10:17:47 +0000 UTC" firstStartedPulling="2026-02-19 10:17:48.48168526 +0000 UTC m=+1977.771116898" lastFinishedPulling="2026-02-19 10:17:49.497093163 +0000 UTC m=+1978.786524811" observedRunningTime="2026-02-19 10:17:50.575956428 +0000 UTC m=+1979.865388066" watchObservedRunningTime="2026-02-19 10:17:50.580727648 +0000 UTC m=+1979.870159296" Feb 19 10:17:58 crc kubenswrapper[4873]: I0219 10:17:58.874677 4873 scope.go:117] "RemoveContainer" containerID="9d6de769e9d17333501a00980ac56829127a539f53f95bb25cf420c5630db360" Feb 19 10:17:58 crc kubenswrapper[4873]: I0219 10:17:58.897586 4873 scope.go:117] "RemoveContainer" containerID="05b98d967cad1ee13631fa999a1fe6d672e3aca8c59af9321211e972aeea3daf" Feb 19 10:17:58 crc kubenswrapper[4873]: I0219 10:17:58.946154 4873 scope.go:117] "RemoveContainer" containerID="d1ee2b33d1585962e5c2d8f8deb1f53ccd7bfb877b0017578ca6bff8f7dfd26e" Feb 19 10:18:18 crc kubenswrapper[4873]: I0219 10:18:18.240424 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:18:18 crc kubenswrapper[4873]: I0219 10:18:18.240815 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:18:39 crc kubenswrapper[4873]: I0219 10:18:39.023431 4873 generic.go:334] "Generic (PLEG): container finished" podID="a607f592-ebca-4bf5-9e98-f9e2bc131ff1" containerID="2f4be555673dd1a1b739a8ef2e49bb827fa57d32386544997526fd8f60519a74" exitCode=0 Feb 19 10:18:39 crc kubenswrapper[4873]: I0219 10:18:39.024026 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" event={"ID":"a607f592-ebca-4bf5-9e98-f9e2bc131ff1","Type":"ContainerDied","Data":"2f4be555673dd1a1b739a8ef2e49bb827fa57d32386544997526fd8f60519a74"} Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.528607 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.621251 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl925\" (UniqueName: \"kubernetes.io/projected/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-kube-api-access-rl925\") pod \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.621378 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.621415 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-nova-metadata-neutron-config-0\") pod 
\"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.621495 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-ssh-key-openstack-edpm-ipam\") pod \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.621577 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-inventory\") pod \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.621704 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-neutron-metadata-combined-ca-bundle\") pod \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\" (UID: \"a607f592-ebca-4bf5-9e98-f9e2bc131ff1\") " Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.628166 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-kube-api-access-rl925" (OuterVolumeSpecName: "kube-api-access-rl925") pod "a607f592-ebca-4bf5-9e98-f9e2bc131ff1" (UID: "a607f592-ebca-4bf5-9e98-f9e2bc131ff1"). InnerVolumeSpecName "kube-api-access-rl925". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.628524 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a607f592-ebca-4bf5-9e98-f9e2bc131ff1" (UID: "a607f592-ebca-4bf5-9e98-f9e2bc131ff1"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.653212 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a607f592-ebca-4bf5-9e98-f9e2bc131ff1" (UID: "a607f592-ebca-4bf5-9e98-f9e2bc131ff1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.660045 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "a607f592-ebca-4bf5-9e98-f9e2bc131ff1" (UID: "a607f592-ebca-4bf5-9e98-f9e2bc131ff1"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.664923 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "a607f592-ebca-4bf5-9e98-f9e2bc131ff1" (UID: "a607f592-ebca-4bf5-9e98-f9e2bc131ff1"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.666535 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-inventory" (OuterVolumeSpecName: "inventory") pod "a607f592-ebca-4bf5-9e98-f9e2bc131ff1" (UID: "a607f592-ebca-4bf5-9e98-f9e2bc131ff1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.723919 4873 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.724003 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl925\" (UniqueName: \"kubernetes.io/projected/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-kube-api-access-rl925\") on node \"crc\" DevicePath \"\"" Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.724015 4873 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.724026 4873 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:18:40 crc kubenswrapper[4873]: I0219 10:18:40.724034 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:18:40 crc 
kubenswrapper[4873]: I0219 10:18:40.724044 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a607f592-ebca-4bf5-9e98-f9e2bc131ff1-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.045367 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" event={"ID":"a607f592-ebca-4bf5-9e98-f9e2bc131ff1","Type":"ContainerDied","Data":"98d6db3ecd221a07d5e3322329f070324703abb980ca281c65ac99b2ade4cb54"} Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.045430 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.045442 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98d6db3ecd221a07d5e3322329f070324703abb980ca281c65ac99b2ade4cb54" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.161484 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf"] Feb 19 10:18:41 crc kubenswrapper[4873]: E0219 10:18:41.161942 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a607f592-ebca-4bf5-9e98-f9e2bc131ff1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.161961 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a607f592-ebca-4bf5-9e98-f9e2bc131ff1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.162182 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a607f592-ebca-4bf5-9e98-f9e2bc131ff1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.162912 4873 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.165416 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.165440 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.166052 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.166191 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.166888 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.187565 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf"] Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.234210 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.234251 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: 
\"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.234286 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.234340 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.234395 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-667dh\" (UniqueName: \"kubernetes.io/projected/2baa296e-fb37-4d90-a7e4-68f61006e085-kube-api-access-667dh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.336505 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.336632 4873 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.337180 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.337277 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.337407 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-667dh\" (UniqueName: \"kubernetes.io/projected/2baa296e-fb37-4d90-a7e4-68f61006e085-kube-api-access-667dh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.341337 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.342668 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.345559 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.351447 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.363145 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-667dh\" (UniqueName: \"kubernetes.io/projected/2baa296e-fb37-4d90-a7e4-68f61006e085-kube-api-access-667dh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:41 crc kubenswrapper[4873]: I0219 10:18:41.480324 4873 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:18:42 crc kubenswrapper[4873]: I0219 10:18:42.025012 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf"] Feb 19 10:18:42 crc kubenswrapper[4873]: I0219 10:18:42.055481 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" event={"ID":"2baa296e-fb37-4d90-a7e4-68f61006e085","Type":"ContainerStarted","Data":"35511ceb1e2947d3a6dc7c9578f623a90ab43f627e3e44514885076fab05f57a"} Feb 19 10:18:43 crc kubenswrapper[4873]: I0219 10:18:43.068499 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" event={"ID":"2baa296e-fb37-4d90-a7e4-68f61006e085","Type":"ContainerStarted","Data":"150a658f281dc96d724d50ff186b8c8e2240351631746c6a8e775330f61234f8"} Feb 19 10:18:43 crc kubenswrapper[4873]: I0219 10:18:43.092445 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" podStartSLOduration=1.3594201830000001 podStartE2EDuration="2.092417937s" podCreationTimestamp="2026-02-19 10:18:41 +0000 UTC" firstStartedPulling="2026-02-19 10:18:42.034840285 +0000 UTC m=+2031.324271923" lastFinishedPulling="2026-02-19 10:18:42.767838039 +0000 UTC m=+2032.057269677" observedRunningTime="2026-02-19 10:18:43.091228848 +0000 UTC m=+2032.380660526" watchObservedRunningTime="2026-02-19 10:18:43.092417937 +0000 UTC m=+2032.381849585" Feb 19 10:18:48 crc kubenswrapper[4873]: I0219 10:18:48.240878 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:18:48 crc 
kubenswrapper[4873]: I0219 10:18:48.241617 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:18:48 crc kubenswrapper[4873]: I0219 10:18:48.241709 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 10:18:48 crc kubenswrapper[4873]: I0219 10:18:48.242923 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ac19c5656812709276b88acb63ac96a06120ceef6f1ea4e7a6c41a75ff13fe5"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 10:18:48 crc kubenswrapper[4873]: I0219 10:18:48.243043 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://4ac19c5656812709276b88acb63ac96a06120ceef6f1ea4e7a6c41a75ff13fe5" gracePeriod=600 Feb 19 10:18:49 crc kubenswrapper[4873]: I0219 10:18:49.132854 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="4ac19c5656812709276b88acb63ac96a06120ceef6f1ea4e7a6c41a75ff13fe5" exitCode=0 Feb 19 10:18:49 crc kubenswrapper[4873]: I0219 10:18:49.132927 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"4ac19c5656812709276b88acb63ac96a06120ceef6f1ea4e7a6c41a75ff13fe5"} 
Feb 19 10:18:49 crc kubenswrapper[4873]: I0219 10:18:49.133532 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806"} Feb 19 10:18:49 crc kubenswrapper[4873]: I0219 10:18:49.133561 4873 scope.go:117] "RemoveContainer" containerID="a79f1351fda39d47e0aa6a236ce6d5659d161f6ce80f3eb2a7b73e5f69948790" Feb 19 10:20:48 crc kubenswrapper[4873]: I0219 10:20:48.240546 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:20:48 crc kubenswrapper[4873]: I0219 10:20:48.241054 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:21:08 crc kubenswrapper[4873]: I0219 10:21:08.190717 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-khxsg"] Feb 19 10:21:08 crc kubenswrapper[4873]: I0219 10:21:08.204753 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:08 crc kubenswrapper[4873]: I0219 10:21:08.219077 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khxsg"] Feb 19 10:21:08 crc kubenswrapper[4873]: I0219 10:21:08.392617 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f991af75-df09-4a06-bb57-78ab59b5ad7e-catalog-content\") pod \"community-operators-khxsg\" (UID: \"f991af75-df09-4a06-bb57-78ab59b5ad7e\") " pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:08 crc kubenswrapper[4873]: I0219 10:21:08.392687 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f991af75-df09-4a06-bb57-78ab59b5ad7e-utilities\") pod \"community-operators-khxsg\" (UID: \"f991af75-df09-4a06-bb57-78ab59b5ad7e\") " pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:08 crc kubenswrapper[4873]: I0219 10:21:08.392815 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gzjs\" (UniqueName: \"kubernetes.io/projected/f991af75-df09-4a06-bb57-78ab59b5ad7e-kube-api-access-8gzjs\") pod \"community-operators-khxsg\" (UID: \"f991af75-df09-4a06-bb57-78ab59b5ad7e\") " pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:08 crc kubenswrapper[4873]: I0219 10:21:08.494897 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f991af75-df09-4a06-bb57-78ab59b5ad7e-catalog-content\") pod \"community-operators-khxsg\" (UID: \"f991af75-df09-4a06-bb57-78ab59b5ad7e\") " pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:08 crc kubenswrapper[4873]: I0219 10:21:08.494964 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f991af75-df09-4a06-bb57-78ab59b5ad7e-utilities\") pod \"community-operators-khxsg\" (UID: \"f991af75-df09-4a06-bb57-78ab59b5ad7e\") " pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:08 crc kubenswrapper[4873]: I0219 10:21:08.495053 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gzjs\" (UniqueName: \"kubernetes.io/projected/f991af75-df09-4a06-bb57-78ab59b5ad7e-kube-api-access-8gzjs\") pod \"community-operators-khxsg\" (UID: \"f991af75-df09-4a06-bb57-78ab59b5ad7e\") " pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:08 crc kubenswrapper[4873]: I0219 10:21:08.495884 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f991af75-df09-4a06-bb57-78ab59b5ad7e-catalog-content\") pod \"community-operators-khxsg\" (UID: \"f991af75-df09-4a06-bb57-78ab59b5ad7e\") " pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:08 crc kubenswrapper[4873]: I0219 10:21:08.496192 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f991af75-df09-4a06-bb57-78ab59b5ad7e-utilities\") pod \"community-operators-khxsg\" (UID: \"f991af75-df09-4a06-bb57-78ab59b5ad7e\") " pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:08 crc kubenswrapper[4873]: I0219 10:21:08.514928 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gzjs\" (UniqueName: \"kubernetes.io/projected/f991af75-df09-4a06-bb57-78ab59b5ad7e-kube-api-access-8gzjs\") pod \"community-operators-khxsg\" (UID: \"f991af75-df09-4a06-bb57-78ab59b5ad7e\") " pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:08 crc kubenswrapper[4873]: I0219 10:21:08.539956 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:09 crc kubenswrapper[4873]: I0219 10:21:09.094378 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khxsg"] Feb 19 10:21:09 crc kubenswrapper[4873]: I0219 10:21:09.483413 4873 generic.go:334] "Generic (PLEG): container finished" podID="f991af75-df09-4a06-bb57-78ab59b5ad7e" containerID="88564ce2a7c708e33071b6f62b0d16c2bb038e8f5355a93d478d75f1ad7f0155" exitCode=0 Feb 19 10:21:09 crc kubenswrapper[4873]: I0219 10:21:09.485530 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 10:21:09 crc kubenswrapper[4873]: I0219 10:21:09.513549 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khxsg" event={"ID":"f991af75-df09-4a06-bb57-78ab59b5ad7e","Type":"ContainerDied","Data":"88564ce2a7c708e33071b6f62b0d16c2bb038e8f5355a93d478d75f1ad7f0155"} Feb 19 10:21:09 crc kubenswrapper[4873]: I0219 10:21:09.513592 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khxsg" event={"ID":"f991af75-df09-4a06-bb57-78ab59b5ad7e","Type":"ContainerStarted","Data":"9a6281209346e113f5ffeda37b6972b056be50b23a259933d9933156f1474537"} Feb 19 10:21:10 crc kubenswrapper[4873]: I0219 10:21:10.492165 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khxsg" event={"ID":"f991af75-df09-4a06-bb57-78ab59b5ad7e","Type":"ContainerStarted","Data":"a26a20cf4b0b405ee529c00faf971a74104e7443611ca93fcd0adf06c42a09c1"} Feb 19 10:21:11 crc kubenswrapper[4873]: I0219 10:21:11.573354 4873 generic.go:334] "Generic (PLEG): container finished" podID="f991af75-df09-4a06-bb57-78ab59b5ad7e" containerID="a26a20cf4b0b405ee529c00faf971a74104e7443611ca93fcd0adf06c42a09c1" exitCode=0 Feb 19 10:21:11 crc kubenswrapper[4873]: I0219 10:21:11.573709 4873 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-khxsg" event={"ID":"f991af75-df09-4a06-bb57-78ab59b5ad7e","Type":"ContainerDied","Data":"a26a20cf4b0b405ee529c00faf971a74104e7443611ca93fcd0adf06c42a09c1"} Feb 19 10:21:12 crc kubenswrapper[4873]: I0219 10:21:12.583473 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khxsg" event={"ID":"f991af75-df09-4a06-bb57-78ab59b5ad7e","Type":"ContainerStarted","Data":"a17f3b6dc772d0a07019802a2bad09bfc1931a1daec0cb2c84a8ce1eca99ee2b"} Feb 19 10:21:12 crc kubenswrapper[4873]: I0219 10:21:12.608710 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-khxsg" podStartSLOduration=1.889963015 podStartE2EDuration="4.608693185s" podCreationTimestamp="2026-02-19 10:21:08 +0000 UTC" firstStartedPulling="2026-02-19 10:21:09.485300379 +0000 UTC m=+2178.774732017" lastFinishedPulling="2026-02-19 10:21:12.204030549 +0000 UTC m=+2181.493462187" observedRunningTime="2026-02-19 10:21:12.602839429 +0000 UTC m=+2181.892271097" watchObservedRunningTime="2026-02-19 10:21:12.608693185 +0000 UTC m=+2181.898124833" Feb 19 10:21:18 crc kubenswrapper[4873]: I0219 10:21:18.240359 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:21:18 crc kubenswrapper[4873]: I0219 10:21:18.240837 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:21:18 crc kubenswrapper[4873]: I0219 10:21:18.541025 4873 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:18 crc kubenswrapper[4873]: I0219 10:21:18.541092 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:18 crc kubenswrapper[4873]: I0219 10:21:18.617048 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:18 crc kubenswrapper[4873]: I0219 10:21:18.711914 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:18 crc kubenswrapper[4873]: I0219 10:21:18.856712 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-khxsg"] Feb 19 10:21:20 crc kubenswrapper[4873]: I0219 10:21:20.680790 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-khxsg" podUID="f991af75-df09-4a06-bb57-78ab59b5ad7e" containerName="registry-server" containerID="cri-o://a17f3b6dc772d0a07019802a2bad09bfc1931a1daec0cb2c84a8ce1eca99ee2b" gracePeriod=2 Feb 19 10:21:21 crc kubenswrapper[4873]: I0219 10:21:21.695182 4873 generic.go:334] "Generic (PLEG): container finished" podID="f991af75-df09-4a06-bb57-78ab59b5ad7e" containerID="a17f3b6dc772d0a07019802a2bad09bfc1931a1daec0cb2c84a8ce1eca99ee2b" exitCode=0 Feb 19 10:21:21 crc kubenswrapper[4873]: I0219 10:21:21.695264 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khxsg" event={"ID":"f991af75-df09-4a06-bb57-78ab59b5ad7e","Type":"ContainerDied","Data":"a17f3b6dc772d0a07019802a2bad09bfc1931a1daec0cb2c84a8ce1eca99ee2b"} Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.400737 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.476500 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gzjs\" (UniqueName: \"kubernetes.io/projected/f991af75-df09-4a06-bb57-78ab59b5ad7e-kube-api-access-8gzjs\") pod \"f991af75-df09-4a06-bb57-78ab59b5ad7e\" (UID: \"f991af75-df09-4a06-bb57-78ab59b5ad7e\") " Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.476624 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f991af75-df09-4a06-bb57-78ab59b5ad7e-utilities\") pod \"f991af75-df09-4a06-bb57-78ab59b5ad7e\" (UID: \"f991af75-df09-4a06-bb57-78ab59b5ad7e\") " Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.476738 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f991af75-df09-4a06-bb57-78ab59b5ad7e-catalog-content\") pod \"f991af75-df09-4a06-bb57-78ab59b5ad7e\" (UID: \"f991af75-df09-4a06-bb57-78ab59b5ad7e\") " Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.478524 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f991af75-df09-4a06-bb57-78ab59b5ad7e-utilities" (OuterVolumeSpecName: "utilities") pod "f991af75-df09-4a06-bb57-78ab59b5ad7e" (UID: "f991af75-df09-4a06-bb57-78ab59b5ad7e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.486656 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f991af75-df09-4a06-bb57-78ab59b5ad7e-kube-api-access-8gzjs" (OuterVolumeSpecName: "kube-api-access-8gzjs") pod "f991af75-df09-4a06-bb57-78ab59b5ad7e" (UID: "f991af75-df09-4a06-bb57-78ab59b5ad7e"). InnerVolumeSpecName "kube-api-access-8gzjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.530987 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f991af75-df09-4a06-bb57-78ab59b5ad7e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f991af75-df09-4a06-bb57-78ab59b5ad7e" (UID: "f991af75-df09-4a06-bb57-78ab59b5ad7e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.578843 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gzjs\" (UniqueName: \"kubernetes.io/projected/f991af75-df09-4a06-bb57-78ab59b5ad7e-kube-api-access-8gzjs\") on node \"crc\" DevicePath \"\"" Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.578879 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f991af75-df09-4a06-bb57-78ab59b5ad7e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.578893 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f991af75-df09-4a06-bb57-78ab59b5ad7e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.704858 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khxsg" event={"ID":"f991af75-df09-4a06-bb57-78ab59b5ad7e","Type":"ContainerDied","Data":"9a6281209346e113f5ffeda37b6972b056be50b23a259933d9933156f1474537"} Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.704913 4873 scope.go:117] "RemoveContainer" containerID="a17f3b6dc772d0a07019802a2bad09bfc1931a1daec0cb2c84a8ce1eca99ee2b" Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.705041 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-khxsg" Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.740296 4873 scope.go:117] "RemoveContainer" containerID="a26a20cf4b0b405ee529c00faf971a74104e7443611ca93fcd0adf06c42a09c1" Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.743714 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-khxsg"] Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.752926 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-khxsg"] Feb 19 10:21:22 crc kubenswrapper[4873]: I0219 10:21:22.769582 4873 scope.go:117] "RemoveContainer" containerID="88564ce2a7c708e33071b6f62b0d16c2bb038e8f5355a93d478d75f1ad7f0155" Feb 19 10:21:23 crc kubenswrapper[4873]: I0219 10:21:23.495591 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f991af75-df09-4a06-bb57-78ab59b5ad7e" path="/var/lib/kubelet/pods/f991af75-df09-4a06-bb57-78ab59b5ad7e/volumes" Feb 19 10:21:48 crc kubenswrapper[4873]: I0219 10:21:48.240011 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:21:48 crc kubenswrapper[4873]: I0219 10:21:48.240659 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:21:48 crc kubenswrapper[4873]: I0219 10:21:48.240713 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 
10:21:48 crc kubenswrapper[4873]: I0219 10:21:48.241592 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 10:21:48 crc kubenswrapper[4873]: I0219 10:21:48.241662 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" gracePeriod=600 Feb 19 10:21:48 crc kubenswrapper[4873]: E0219 10:21:48.363126 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:21:48 crc kubenswrapper[4873]: I0219 10:21:48.946424 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" exitCode=0 Feb 19 10:21:48 crc kubenswrapper[4873]: I0219 10:21:48.946497 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806"} Feb 19 10:21:48 crc kubenswrapper[4873]: I0219 10:21:48.946548 4873 scope.go:117] 
"RemoveContainer" containerID="4ac19c5656812709276b88acb63ac96a06120ceef6f1ea4e7a6c41a75ff13fe5" Feb 19 10:21:48 crc kubenswrapper[4873]: I0219 10:21:48.947595 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:21:48 crc kubenswrapper[4873]: E0219 10:21:48.948144 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:22:01 crc kubenswrapper[4873]: I0219 10:22:01.494496 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:22:01 crc kubenswrapper[4873]: E0219 10:22:01.495712 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:22:15 crc kubenswrapper[4873]: I0219 10:22:15.484730 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:22:15 crc kubenswrapper[4873]: E0219 10:22:15.485540 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.076142 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jt52q"] Feb 19 10:22:22 crc kubenswrapper[4873]: E0219 10:22:22.077490 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f991af75-df09-4a06-bb57-78ab59b5ad7e" containerName="extract-content" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.077516 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f991af75-df09-4a06-bb57-78ab59b5ad7e" containerName="extract-content" Feb 19 10:22:22 crc kubenswrapper[4873]: E0219 10:22:22.077537 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f991af75-df09-4a06-bb57-78ab59b5ad7e" containerName="extract-utilities" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.077549 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f991af75-df09-4a06-bb57-78ab59b5ad7e" containerName="extract-utilities" Feb 19 10:22:22 crc kubenswrapper[4873]: E0219 10:22:22.077576 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f991af75-df09-4a06-bb57-78ab59b5ad7e" containerName="registry-server" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.077588 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f991af75-df09-4a06-bb57-78ab59b5ad7e" containerName="registry-server" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.077914 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f991af75-df09-4a06-bb57-78ab59b5ad7e" containerName="registry-server" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.080158 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.089279 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jt52q"] Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.168230 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9566d3f3-4b33-430a-9a8c-a32b3425b487-catalog-content\") pod \"redhat-marketplace-jt52q\" (UID: \"9566d3f3-4b33-430a-9a8c-a32b3425b487\") " pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.168310 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9566d3f3-4b33-430a-9a8c-a32b3425b487-utilities\") pod \"redhat-marketplace-jt52q\" (UID: \"9566d3f3-4b33-430a-9a8c-a32b3425b487\") " pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.168901 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrvp7\" (UniqueName: \"kubernetes.io/projected/9566d3f3-4b33-430a-9a8c-a32b3425b487-kube-api-access-rrvp7\") pod \"redhat-marketplace-jt52q\" (UID: \"9566d3f3-4b33-430a-9a8c-a32b3425b487\") " pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.270932 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9566d3f3-4b33-430a-9a8c-a32b3425b487-catalog-content\") pod \"redhat-marketplace-jt52q\" (UID: \"9566d3f3-4b33-430a-9a8c-a32b3425b487\") " pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.270990 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9566d3f3-4b33-430a-9a8c-a32b3425b487-utilities\") pod \"redhat-marketplace-jt52q\" (UID: \"9566d3f3-4b33-430a-9a8c-a32b3425b487\") " pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.271030 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrvp7\" (UniqueName: \"kubernetes.io/projected/9566d3f3-4b33-430a-9a8c-a32b3425b487-kube-api-access-rrvp7\") pod \"redhat-marketplace-jt52q\" (UID: \"9566d3f3-4b33-430a-9a8c-a32b3425b487\") " pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.271582 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9566d3f3-4b33-430a-9a8c-a32b3425b487-utilities\") pod \"redhat-marketplace-jt52q\" (UID: \"9566d3f3-4b33-430a-9a8c-a32b3425b487\") " pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.271743 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9566d3f3-4b33-430a-9a8c-a32b3425b487-catalog-content\") pod \"redhat-marketplace-jt52q\" (UID: \"9566d3f3-4b33-430a-9a8c-a32b3425b487\") " pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.292611 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrvp7\" (UniqueName: \"kubernetes.io/projected/9566d3f3-4b33-430a-9a8c-a32b3425b487-kube-api-access-rrvp7\") pod \"redhat-marketplace-jt52q\" (UID: \"9566d3f3-4b33-430a-9a8c-a32b3425b487\") " pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.439180 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.673370 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j4bwz"] Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.678759 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.686046 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j4bwz"] Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.784384 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687dca98-34c6-47e6-a7cf-89bf448a3426-catalog-content\") pod \"redhat-operators-j4bwz\" (UID: \"687dca98-34c6-47e6-a7cf-89bf448a3426\") " pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.784529 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4fg2\" (UniqueName: \"kubernetes.io/projected/687dca98-34c6-47e6-a7cf-89bf448a3426-kube-api-access-w4fg2\") pod \"redhat-operators-j4bwz\" (UID: \"687dca98-34c6-47e6-a7cf-89bf448a3426\") " pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.784944 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687dca98-34c6-47e6-a7cf-89bf448a3426-utilities\") pod \"redhat-operators-j4bwz\" (UID: \"687dca98-34c6-47e6-a7cf-89bf448a3426\") " pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.886575 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/687dca98-34c6-47e6-a7cf-89bf448a3426-catalog-content\") pod \"redhat-operators-j4bwz\" (UID: \"687dca98-34c6-47e6-a7cf-89bf448a3426\") " pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.886667 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4fg2\" (UniqueName: \"kubernetes.io/projected/687dca98-34c6-47e6-a7cf-89bf448a3426-kube-api-access-w4fg2\") pod \"redhat-operators-j4bwz\" (UID: \"687dca98-34c6-47e6-a7cf-89bf448a3426\") " pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.886822 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687dca98-34c6-47e6-a7cf-89bf448a3426-utilities\") pod \"redhat-operators-j4bwz\" (UID: \"687dca98-34c6-47e6-a7cf-89bf448a3426\") " pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.887160 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687dca98-34c6-47e6-a7cf-89bf448a3426-catalog-content\") pod \"redhat-operators-j4bwz\" (UID: \"687dca98-34c6-47e6-a7cf-89bf448a3426\") " pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.887303 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687dca98-34c6-47e6-a7cf-89bf448a3426-utilities\") pod \"redhat-operators-j4bwz\" (UID: \"687dca98-34c6-47e6-a7cf-89bf448a3426\") " pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.912687 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4fg2\" (UniqueName: 
\"kubernetes.io/projected/687dca98-34c6-47e6-a7cf-89bf448a3426-kube-api-access-w4fg2\") pod \"redhat-operators-j4bwz\" (UID: \"687dca98-34c6-47e6-a7cf-89bf448a3426\") " pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:22 crc kubenswrapper[4873]: I0219 10:22:22.961370 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jt52q"] Feb 19 10:22:23 crc kubenswrapper[4873]: I0219 10:22:22.999984 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:23 crc kubenswrapper[4873]: I0219 10:22:23.286704 4873 generic.go:334] "Generic (PLEG): container finished" podID="9566d3f3-4b33-430a-9a8c-a32b3425b487" containerID="510653dd0fd33a1ceeb0daa0d0a2e54a7b51b7404cbad36dc2f8b57117f6c482" exitCode=0 Feb 19 10:22:23 crc kubenswrapper[4873]: I0219 10:22:23.286956 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jt52q" event={"ID":"9566d3f3-4b33-430a-9a8c-a32b3425b487","Type":"ContainerDied","Data":"510653dd0fd33a1ceeb0daa0d0a2e54a7b51b7404cbad36dc2f8b57117f6c482"} Feb 19 10:22:23 crc kubenswrapper[4873]: I0219 10:22:23.286979 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jt52q" event={"ID":"9566d3f3-4b33-430a-9a8c-a32b3425b487","Type":"ContainerStarted","Data":"71ec27bcfbbc970191c06674be63b5836ab590c5cabc97f66738bf2c975611ef"} Feb 19 10:22:23 crc kubenswrapper[4873]: I0219 10:22:23.459583 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j4bwz"] Feb 19 10:22:23 crc kubenswrapper[4873]: W0219 10:22:23.460172 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687dca98_34c6_47e6_a7cf_89bf448a3426.slice/crio-5809c17a99debc08e53af7053478d25e34b6c3d1eac2f1472718d828db0dea27 WatchSource:0}: Error finding container 
5809c17a99debc08e53af7053478d25e34b6c3d1eac2f1472718d828db0dea27: Status 404 returned error can't find the container with id 5809c17a99debc08e53af7053478d25e34b6c3d1eac2f1472718d828db0dea27 Feb 19 10:22:24 crc kubenswrapper[4873]: I0219 10:22:24.296589 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jt52q" event={"ID":"9566d3f3-4b33-430a-9a8c-a32b3425b487","Type":"ContainerStarted","Data":"94b6f685f6f4cc69951d4449012cc4b5ef34653b8f05ea8a0f34417d235c7192"} Feb 19 10:22:24 crc kubenswrapper[4873]: I0219 10:22:24.298219 4873 generic.go:334] "Generic (PLEG): container finished" podID="687dca98-34c6-47e6-a7cf-89bf448a3426" containerID="648394c940eca417606a1205ea47146261640581b2a250bad362a4285f73a1b2" exitCode=0 Feb 19 10:22:24 crc kubenswrapper[4873]: I0219 10:22:24.298270 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4bwz" event={"ID":"687dca98-34c6-47e6-a7cf-89bf448a3426","Type":"ContainerDied","Data":"648394c940eca417606a1205ea47146261640581b2a250bad362a4285f73a1b2"} Feb 19 10:22:24 crc kubenswrapper[4873]: I0219 10:22:24.298301 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4bwz" event={"ID":"687dca98-34c6-47e6-a7cf-89bf448a3426","Type":"ContainerStarted","Data":"5809c17a99debc08e53af7053478d25e34b6c3d1eac2f1472718d828db0dea27"} Feb 19 10:22:25 crc kubenswrapper[4873]: I0219 10:22:25.308129 4873 generic.go:334] "Generic (PLEG): container finished" podID="9566d3f3-4b33-430a-9a8c-a32b3425b487" containerID="94b6f685f6f4cc69951d4449012cc4b5ef34653b8f05ea8a0f34417d235c7192" exitCode=0 Feb 19 10:22:25 crc kubenswrapper[4873]: I0219 10:22:25.308383 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jt52q" event={"ID":"9566d3f3-4b33-430a-9a8c-a32b3425b487","Type":"ContainerDied","Data":"94b6f685f6f4cc69951d4449012cc4b5ef34653b8f05ea8a0f34417d235c7192"} Feb 19 10:22:26 
crc kubenswrapper[4873]: I0219 10:22:26.326683 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4bwz" event={"ID":"687dca98-34c6-47e6-a7cf-89bf448a3426","Type":"ContainerStarted","Data":"372cfba496aa8daf99f042c076f07838550f2060d0c2498dd29a0f4a4051bad1"} Feb 19 10:22:26 crc kubenswrapper[4873]: E0219 10:22:26.980698 4873 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687dca98_34c6_47e6_a7cf_89bf448a3426.slice/crio-conmon-372cfba496aa8daf99f042c076f07838550f2060d0c2498dd29a0f4a4051bad1.scope\": RecentStats: unable to find data in memory cache]" Feb 19 10:22:27 crc kubenswrapper[4873]: I0219 10:22:27.337430 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jt52q" event={"ID":"9566d3f3-4b33-430a-9a8c-a32b3425b487","Type":"ContainerStarted","Data":"5bbae291b4d7ddf6b641be79b8adb324a29804d4ff6e6a62036b7fe076225f20"} Feb 19 10:22:27 crc kubenswrapper[4873]: I0219 10:22:27.340232 4873 generic.go:334] "Generic (PLEG): container finished" podID="687dca98-34c6-47e6-a7cf-89bf448a3426" containerID="372cfba496aa8daf99f042c076f07838550f2060d0c2498dd29a0f4a4051bad1" exitCode=0 Feb 19 10:22:27 crc kubenswrapper[4873]: I0219 10:22:27.340268 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4bwz" event={"ID":"687dca98-34c6-47e6-a7cf-89bf448a3426","Type":"ContainerDied","Data":"372cfba496aa8daf99f042c076f07838550f2060d0c2498dd29a0f4a4051bad1"} Feb 19 10:22:27 crc kubenswrapper[4873]: I0219 10:22:27.362720 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jt52q" podStartSLOduration=2.412323232 podStartE2EDuration="5.362701752s" podCreationTimestamp="2026-02-19 10:22:22 +0000 UTC" firstStartedPulling="2026-02-19 10:22:23.29033891 +0000 UTC 
m=+2252.579770548" lastFinishedPulling="2026-02-19 10:22:26.24071743 +0000 UTC m=+2255.530149068" observedRunningTime="2026-02-19 10:22:27.355461592 +0000 UTC m=+2256.644893260" watchObservedRunningTime="2026-02-19 10:22:27.362701752 +0000 UTC m=+2256.652133390" Feb 19 10:22:28 crc kubenswrapper[4873]: I0219 10:22:28.351797 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4bwz" event={"ID":"687dca98-34c6-47e6-a7cf-89bf448a3426","Type":"ContainerStarted","Data":"45d62d795db1c823d67e6e5582b82865d36d5ea8aab87289e80ac1ab64eab8cc"} Feb 19 10:22:30 crc kubenswrapper[4873]: I0219 10:22:30.485146 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:22:30 crc kubenswrapper[4873]: E0219 10:22:30.485819 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:22:32 crc kubenswrapper[4873]: I0219 10:22:32.440271 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:32 crc kubenswrapper[4873]: I0219 10:22:32.440310 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:32 crc kubenswrapper[4873]: I0219 10:22:32.482660 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:32 crc kubenswrapper[4873]: I0219 10:22:32.501652 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-j4bwz" podStartSLOduration=7.004408902 podStartE2EDuration="10.501634455s" podCreationTimestamp="2026-02-19 10:22:22 +0000 UTC" firstStartedPulling="2026-02-19 10:22:24.299471047 +0000 UTC m=+2253.588902685" lastFinishedPulling="2026-02-19 10:22:27.79669659 +0000 UTC m=+2257.086128238" observedRunningTime="2026-02-19 10:22:28.374917737 +0000 UTC m=+2257.664349395" watchObservedRunningTime="2026-02-19 10:22:32.501634455 +0000 UTC m=+2261.791066093" Feb 19 10:22:33 crc kubenswrapper[4873]: I0219 10:22:33.000086 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:33 crc kubenswrapper[4873]: I0219 10:22:33.000169 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:33 crc kubenswrapper[4873]: I0219 10:22:33.442940 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:33 crc kubenswrapper[4873]: I0219 10:22:33.722545 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jt52q"] Feb 19 10:22:34 crc kubenswrapper[4873]: I0219 10:22:34.046793 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j4bwz" podUID="687dca98-34c6-47e6-a7cf-89bf448a3426" containerName="registry-server" probeResult="failure" output=< Feb 19 10:22:34 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s Feb 19 10:22:34 crc kubenswrapper[4873]: > Feb 19 10:22:35 crc kubenswrapper[4873]: I0219 10:22:35.412864 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jt52q" podUID="9566d3f3-4b33-430a-9a8c-a32b3425b487" containerName="registry-server" 
containerID="cri-o://5bbae291b4d7ddf6b641be79b8adb324a29804d4ff6e6a62036b7fe076225f20" gracePeriod=2 Feb 19 10:22:35 crc kubenswrapper[4873]: I0219 10:22:35.869039 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:35 crc kubenswrapper[4873]: I0219 10:22:35.957210 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9566d3f3-4b33-430a-9a8c-a32b3425b487-catalog-content\") pod \"9566d3f3-4b33-430a-9a8c-a32b3425b487\" (UID: \"9566d3f3-4b33-430a-9a8c-a32b3425b487\") " Feb 19 10:22:35 crc kubenswrapper[4873]: I0219 10:22:35.957339 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrvp7\" (UniqueName: \"kubernetes.io/projected/9566d3f3-4b33-430a-9a8c-a32b3425b487-kube-api-access-rrvp7\") pod \"9566d3f3-4b33-430a-9a8c-a32b3425b487\" (UID: \"9566d3f3-4b33-430a-9a8c-a32b3425b487\") " Feb 19 10:22:35 crc kubenswrapper[4873]: I0219 10:22:35.957435 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9566d3f3-4b33-430a-9a8c-a32b3425b487-utilities\") pod \"9566d3f3-4b33-430a-9a8c-a32b3425b487\" (UID: \"9566d3f3-4b33-430a-9a8c-a32b3425b487\") " Feb 19 10:22:35 crc kubenswrapper[4873]: I0219 10:22:35.958312 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9566d3f3-4b33-430a-9a8c-a32b3425b487-utilities" (OuterVolumeSpecName: "utilities") pod "9566d3f3-4b33-430a-9a8c-a32b3425b487" (UID: "9566d3f3-4b33-430a-9a8c-a32b3425b487"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:22:35 crc kubenswrapper[4873]: I0219 10:22:35.963567 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9566d3f3-4b33-430a-9a8c-a32b3425b487-kube-api-access-rrvp7" (OuterVolumeSpecName: "kube-api-access-rrvp7") pod "9566d3f3-4b33-430a-9a8c-a32b3425b487" (UID: "9566d3f3-4b33-430a-9a8c-a32b3425b487"). InnerVolumeSpecName "kube-api-access-rrvp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:22:35 crc kubenswrapper[4873]: I0219 10:22:35.980454 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9566d3f3-4b33-430a-9a8c-a32b3425b487-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9566d3f3-4b33-430a-9a8c-a32b3425b487" (UID: "9566d3f3-4b33-430a-9a8c-a32b3425b487"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.059475 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9566d3f3-4b33-430a-9a8c-a32b3425b487-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.059515 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrvp7\" (UniqueName: \"kubernetes.io/projected/9566d3f3-4b33-430a-9a8c-a32b3425b487-kube-api-access-rrvp7\") on node \"crc\" DevicePath \"\"" Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.059547 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9566d3f3-4b33-430a-9a8c-a32b3425b487-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.424933 4873 generic.go:334] "Generic (PLEG): container finished" podID="9566d3f3-4b33-430a-9a8c-a32b3425b487" 
containerID="5bbae291b4d7ddf6b641be79b8adb324a29804d4ff6e6a62036b7fe076225f20" exitCode=0 Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.425012 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jt52q" Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.425008 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jt52q" event={"ID":"9566d3f3-4b33-430a-9a8c-a32b3425b487","Type":"ContainerDied","Data":"5bbae291b4d7ddf6b641be79b8adb324a29804d4ff6e6a62036b7fe076225f20"} Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.425699 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jt52q" event={"ID":"9566d3f3-4b33-430a-9a8c-a32b3425b487","Type":"ContainerDied","Data":"71ec27bcfbbc970191c06674be63b5836ab590c5cabc97f66738bf2c975611ef"} Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.425747 4873 scope.go:117] "RemoveContainer" containerID="5bbae291b4d7ddf6b641be79b8adb324a29804d4ff6e6a62036b7fe076225f20" Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.453767 4873 scope.go:117] "RemoveContainer" containerID="94b6f685f6f4cc69951d4449012cc4b5ef34653b8f05ea8a0f34417d235c7192" Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.468766 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jt52q"] Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.476769 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jt52q"] Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.495942 4873 scope.go:117] "RemoveContainer" containerID="510653dd0fd33a1ceeb0daa0d0a2e54a7b51b7404cbad36dc2f8b57117f6c482" Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.521667 4873 scope.go:117] "RemoveContainer" containerID="5bbae291b4d7ddf6b641be79b8adb324a29804d4ff6e6a62036b7fe076225f20" Feb 19 
10:22:36 crc kubenswrapper[4873]: E0219 10:22:36.522275 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bbae291b4d7ddf6b641be79b8adb324a29804d4ff6e6a62036b7fe076225f20\": container with ID starting with 5bbae291b4d7ddf6b641be79b8adb324a29804d4ff6e6a62036b7fe076225f20 not found: ID does not exist" containerID="5bbae291b4d7ddf6b641be79b8adb324a29804d4ff6e6a62036b7fe076225f20" Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.522339 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bbae291b4d7ddf6b641be79b8adb324a29804d4ff6e6a62036b7fe076225f20"} err="failed to get container status \"5bbae291b4d7ddf6b641be79b8adb324a29804d4ff6e6a62036b7fe076225f20\": rpc error: code = NotFound desc = could not find container \"5bbae291b4d7ddf6b641be79b8adb324a29804d4ff6e6a62036b7fe076225f20\": container with ID starting with 5bbae291b4d7ddf6b641be79b8adb324a29804d4ff6e6a62036b7fe076225f20 not found: ID does not exist" Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.522382 4873 scope.go:117] "RemoveContainer" containerID="94b6f685f6f4cc69951d4449012cc4b5ef34653b8f05ea8a0f34417d235c7192" Feb 19 10:22:36 crc kubenswrapper[4873]: E0219 10:22:36.522877 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94b6f685f6f4cc69951d4449012cc4b5ef34653b8f05ea8a0f34417d235c7192\": container with ID starting with 94b6f685f6f4cc69951d4449012cc4b5ef34653b8f05ea8a0f34417d235c7192 not found: ID does not exist" containerID="94b6f685f6f4cc69951d4449012cc4b5ef34653b8f05ea8a0f34417d235c7192" Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.522930 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94b6f685f6f4cc69951d4449012cc4b5ef34653b8f05ea8a0f34417d235c7192"} err="failed to get container status 
\"94b6f685f6f4cc69951d4449012cc4b5ef34653b8f05ea8a0f34417d235c7192\": rpc error: code = NotFound desc = could not find container \"94b6f685f6f4cc69951d4449012cc4b5ef34653b8f05ea8a0f34417d235c7192\": container with ID starting with 94b6f685f6f4cc69951d4449012cc4b5ef34653b8f05ea8a0f34417d235c7192 not found: ID does not exist" Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.522967 4873 scope.go:117] "RemoveContainer" containerID="510653dd0fd33a1ceeb0daa0d0a2e54a7b51b7404cbad36dc2f8b57117f6c482" Feb 19 10:22:36 crc kubenswrapper[4873]: E0219 10:22:36.523241 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"510653dd0fd33a1ceeb0daa0d0a2e54a7b51b7404cbad36dc2f8b57117f6c482\": container with ID starting with 510653dd0fd33a1ceeb0daa0d0a2e54a7b51b7404cbad36dc2f8b57117f6c482 not found: ID does not exist" containerID="510653dd0fd33a1ceeb0daa0d0a2e54a7b51b7404cbad36dc2f8b57117f6c482" Feb 19 10:22:36 crc kubenswrapper[4873]: I0219 10:22:36.523280 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"510653dd0fd33a1ceeb0daa0d0a2e54a7b51b7404cbad36dc2f8b57117f6c482"} err="failed to get container status \"510653dd0fd33a1ceeb0daa0d0a2e54a7b51b7404cbad36dc2f8b57117f6c482\": rpc error: code = NotFound desc = could not find container \"510653dd0fd33a1ceeb0daa0d0a2e54a7b51b7404cbad36dc2f8b57117f6c482\": container with ID starting with 510653dd0fd33a1ceeb0daa0d0a2e54a7b51b7404cbad36dc2f8b57117f6c482 not found: ID does not exist" Feb 19 10:22:37 crc kubenswrapper[4873]: I0219 10:22:37.495651 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9566d3f3-4b33-430a-9a8c-a32b3425b487" path="/var/lib/kubelet/pods/9566d3f3-4b33-430a-9a8c-a32b3425b487/volumes" Feb 19 10:22:40 crc kubenswrapper[4873]: I0219 10:22:40.479557 4873 generic.go:334] "Generic (PLEG): container finished" podID="2baa296e-fb37-4d90-a7e4-68f61006e085" 
containerID="150a658f281dc96d724d50ff186b8c8e2240351631746c6a8e775330f61234f8" exitCode=0 Feb 19 10:22:40 crc kubenswrapper[4873]: I0219 10:22:40.479640 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" event={"ID":"2baa296e-fb37-4d90-a7e4-68f61006e085","Type":"ContainerDied","Data":"150a658f281dc96d724d50ff186b8c8e2240351631746c6a8e775330f61234f8"} Feb 19 10:22:41 crc kubenswrapper[4873]: I0219 10:22:41.900883 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:22:41 crc kubenswrapper[4873]: I0219 10:22:41.975456 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-inventory\") pod \"2baa296e-fb37-4d90-a7e4-68f61006e085\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " Feb 19 10:22:41 crc kubenswrapper[4873]: I0219 10:22:41.975592 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-libvirt-combined-ca-bundle\") pod \"2baa296e-fb37-4d90-a7e4-68f61006e085\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " Feb 19 10:22:41 crc kubenswrapper[4873]: I0219 10:22:41.975652 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-libvirt-secret-0\") pod \"2baa296e-fb37-4d90-a7e4-68f61006e085\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " Feb 19 10:22:41 crc kubenswrapper[4873]: I0219 10:22:41.975671 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-667dh\" (UniqueName: \"kubernetes.io/projected/2baa296e-fb37-4d90-a7e4-68f61006e085-kube-api-access-667dh\") pod 
\"2baa296e-fb37-4d90-a7e4-68f61006e085\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " Feb 19 10:22:41 crc kubenswrapper[4873]: I0219 10:22:41.976226 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-ssh-key-openstack-edpm-ipam\") pod \"2baa296e-fb37-4d90-a7e4-68f61006e085\" (UID: \"2baa296e-fb37-4d90-a7e4-68f61006e085\") " Feb 19 10:22:41 crc kubenswrapper[4873]: I0219 10:22:41.980588 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2baa296e-fb37-4d90-a7e4-68f61006e085-kube-api-access-667dh" (OuterVolumeSpecName: "kube-api-access-667dh") pod "2baa296e-fb37-4d90-a7e4-68f61006e085" (UID: "2baa296e-fb37-4d90-a7e4-68f61006e085"). InnerVolumeSpecName "kube-api-access-667dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:22:41 crc kubenswrapper[4873]: I0219 10:22:41.981258 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2baa296e-fb37-4d90-a7e4-68f61006e085" (UID: "2baa296e-fb37-4d90-a7e4-68f61006e085"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.002998 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-inventory" (OuterVolumeSpecName: "inventory") pod "2baa296e-fb37-4d90-a7e4-68f61006e085" (UID: "2baa296e-fb37-4d90-a7e4-68f61006e085"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.006274 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2baa296e-fb37-4d90-a7e4-68f61006e085" (UID: "2baa296e-fb37-4d90-a7e4-68f61006e085"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.027110 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "2baa296e-fb37-4d90-a7e4-68f61006e085" (UID: "2baa296e-fb37-4d90-a7e4-68f61006e085"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.078894 4873 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.078925 4873 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.078934 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-667dh\" (UniqueName: \"kubernetes.io/projected/2baa296e-fb37-4d90-a7e4-68f61006e085-kube-api-access-667dh\") on node \"crc\" DevicePath \"\"" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.078942 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.078951 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2baa296e-fb37-4d90-a7e4-68f61006e085-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.510913 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" event={"ID":"2baa296e-fb37-4d90-a7e4-68f61006e085","Type":"ContainerDied","Data":"35511ceb1e2947d3a6dc7c9578f623a90ab43f627e3e44514885076fab05f57a"} Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.510967 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35511ceb1e2947d3a6dc7c9578f623a90ab43f627e3e44514885076fab05f57a" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.511051 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.633392 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6"] Feb 19 10:22:42 crc kubenswrapper[4873]: E0219 10:22:42.633863 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9566d3f3-4b33-430a-9a8c-a32b3425b487" containerName="extract-content" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.633884 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9566d3f3-4b33-430a-9a8c-a32b3425b487" containerName="extract-content" Feb 19 10:22:42 crc kubenswrapper[4873]: E0219 10:22:42.633901 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2baa296e-fb37-4d90-a7e4-68f61006e085" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.633909 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="2baa296e-fb37-4d90-a7e4-68f61006e085" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 10:22:42 crc kubenswrapper[4873]: E0219 10:22:42.633937 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9566d3f3-4b33-430a-9a8c-a32b3425b487" containerName="extract-utilities" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.633945 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9566d3f3-4b33-430a-9a8c-a32b3425b487" containerName="extract-utilities" Feb 19 10:22:42 crc kubenswrapper[4873]: E0219 10:22:42.633955 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9566d3f3-4b33-430a-9a8c-a32b3425b487" containerName="registry-server" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.633963 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9566d3f3-4b33-430a-9a8c-a32b3425b487" containerName="registry-server" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.634217 4873 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2baa296e-fb37-4d90-a7e4-68f61006e085" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.634244 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="9566d3f3-4b33-430a-9a8c-a32b3425b487" containerName="registry-server" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.634987 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.638897 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.640005 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.640195 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.641093 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.641450 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.642004 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.642194 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.656359 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6"] Feb 19 10:22:42 crc 
kubenswrapper[4873]: I0219 10:22:42.694291 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmvvj\" (UniqueName: \"kubernetes.io/projected/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-kube-api-access-wmvvj\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.694454 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.694483 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.694508 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.694532 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.694602 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.694633 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.694679 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.694754 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-ssh-key-openstack-edpm-ipam\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.694778 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.694801 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.796626 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.796684 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc 
kubenswrapper[4873]: I0219 10:22:42.796831 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.796860 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.796930 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.796980 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.797053 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.797140 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.797179 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.797224 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.797291 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmvvj\" (UniqueName: \"kubernetes.io/projected/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-kube-api-access-wmvvj\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.798872 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.799947 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.800191 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.803590 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.804796 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.805595 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.805623 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.805983 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.809968 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc 
kubenswrapper[4873]: I0219 10:22:42.819552 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmvvj\" (UniqueName: \"kubernetes.io/projected/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-kube-api-access-wmvvj\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.826484 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v25t6\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:42 crc kubenswrapper[4873]: I0219 10:22:42.973534 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:22:43 crc kubenswrapper[4873]: I0219 10:22:43.046658 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:43 crc kubenswrapper[4873]: I0219 10:22:43.112681 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:43 crc kubenswrapper[4873]: W0219 10:22:43.535208 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce5f426d_554a_469a_be1e_e3e1b9bfa68e.slice/crio-98cf7115a91b048cf40f5fa0771846df571cc77d9386148ce081fbf2082750b7 WatchSource:0}: Error finding container 98cf7115a91b048cf40f5fa0771846df571cc77d9386148ce081fbf2082750b7: Status 404 returned error can't find the container with id 98cf7115a91b048cf40f5fa0771846df571cc77d9386148ce081fbf2082750b7 Feb 19 10:22:43 crc 
kubenswrapper[4873]: I0219 10:22:43.535777 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6"] Feb 19 10:22:44 crc kubenswrapper[4873]: I0219 10:22:44.531273 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" event={"ID":"ce5f426d-554a-469a-be1e-e3e1b9bfa68e","Type":"ContainerStarted","Data":"7c7950ee14ffefcf75376610caefc5f50e26f12cde49f2346a88b479e29c5643"} Feb 19 10:22:44 crc kubenswrapper[4873]: I0219 10:22:44.531610 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" event={"ID":"ce5f426d-554a-469a-be1e-e3e1b9bfa68e","Type":"ContainerStarted","Data":"98cf7115a91b048cf40f5fa0771846df571cc77d9386148ce081fbf2082750b7"} Feb 19 10:22:44 crc kubenswrapper[4873]: I0219 10:22:44.553518 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" podStartSLOduration=2.111810755 podStartE2EDuration="2.553496848s" podCreationTimestamp="2026-02-19 10:22:42 +0000 UTC" firstStartedPulling="2026-02-19 10:22:43.538172475 +0000 UTC m=+2272.827604123" lastFinishedPulling="2026-02-19 10:22:43.979858568 +0000 UTC m=+2273.269290216" observedRunningTime="2026-02-19 10:22:44.548678688 +0000 UTC m=+2273.838110346" watchObservedRunningTime="2026-02-19 10:22:44.553496848 +0000 UTC m=+2273.842928486" Feb 19 10:22:45 crc kubenswrapper[4873]: I0219 10:22:45.485317 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:22:45 crc kubenswrapper[4873]: E0219 10:22:45.486066 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:22:45 crc kubenswrapper[4873]: I0219 10:22:45.981298 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j4bwz"] Feb 19 10:22:45 crc kubenswrapper[4873]: I0219 10:22:45.981509 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j4bwz" podUID="687dca98-34c6-47e6-a7cf-89bf448a3426" containerName="registry-server" containerID="cri-o://45d62d795db1c823d67e6e5582b82865d36d5ea8aab87289e80ac1ab64eab8cc" gracePeriod=2 Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.444990 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.469002 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687dca98-34c6-47e6-a7cf-89bf448a3426-catalog-content\") pod \"687dca98-34c6-47e6-a7cf-89bf448a3426\" (UID: \"687dca98-34c6-47e6-a7cf-89bf448a3426\") " Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.469165 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687dca98-34c6-47e6-a7cf-89bf448a3426-utilities\") pod \"687dca98-34c6-47e6-a7cf-89bf448a3426\" (UID: \"687dca98-34c6-47e6-a7cf-89bf448a3426\") " Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.469221 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4fg2\" (UniqueName: \"kubernetes.io/projected/687dca98-34c6-47e6-a7cf-89bf448a3426-kube-api-access-w4fg2\") pod \"687dca98-34c6-47e6-a7cf-89bf448a3426\" (UID: 
\"687dca98-34c6-47e6-a7cf-89bf448a3426\") " Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.470128 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/687dca98-34c6-47e6-a7cf-89bf448a3426-utilities" (OuterVolumeSpecName: "utilities") pod "687dca98-34c6-47e6-a7cf-89bf448a3426" (UID: "687dca98-34c6-47e6-a7cf-89bf448a3426"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.485337 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/687dca98-34c6-47e6-a7cf-89bf448a3426-kube-api-access-w4fg2" (OuterVolumeSpecName: "kube-api-access-w4fg2") pod "687dca98-34c6-47e6-a7cf-89bf448a3426" (UID: "687dca98-34c6-47e6-a7cf-89bf448a3426"). InnerVolumeSpecName "kube-api-access-w4fg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.548982 4873 generic.go:334] "Generic (PLEG): container finished" podID="687dca98-34c6-47e6-a7cf-89bf448a3426" containerID="45d62d795db1c823d67e6e5582b82865d36d5ea8aab87289e80ac1ab64eab8cc" exitCode=0 Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.549031 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4bwz" event={"ID":"687dca98-34c6-47e6-a7cf-89bf448a3426","Type":"ContainerDied","Data":"45d62d795db1c823d67e6e5582b82865d36d5ea8aab87289e80ac1ab64eab8cc"} Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.549050 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j4bwz" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.549058 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4bwz" event={"ID":"687dca98-34c6-47e6-a7cf-89bf448a3426","Type":"ContainerDied","Data":"5809c17a99debc08e53af7053478d25e34b6c3d1eac2f1472718d828db0dea27"} Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.549080 4873 scope.go:117] "RemoveContainer" containerID="45d62d795db1c823d67e6e5582b82865d36d5ea8aab87289e80ac1ab64eab8cc" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.579475 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687dca98-34c6-47e6-a7cf-89bf448a3426-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.579603 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4fg2\" (UniqueName: \"kubernetes.io/projected/687dca98-34c6-47e6-a7cf-89bf448a3426-kube-api-access-w4fg2\") on node \"crc\" DevicePath \"\"" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.588424 4873 scope.go:117] "RemoveContainer" containerID="372cfba496aa8daf99f042c076f07838550f2060d0c2498dd29a0f4a4051bad1" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.607453 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/687dca98-34c6-47e6-a7cf-89bf448a3426-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "687dca98-34c6-47e6-a7cf-89bf448a3426" (UID: "687dca98-34c6-47e6-a7cf-89bf448a3426"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.614343 4873 scope.go:117] "RemoveContainer" containerID="648394c940eca417606a1205ea47146261640581b2a250bad362a4285f73a1b2" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.677176 4873 scope.go:117] "RemoveContainer" containerID="45d62d795db1c823d67e6e5582b82865d36d5ea8aab87289e80ac1ab64eab8cc" Feb 19 10:22:46 crc kubenswrapper[4873]: E0219 10:22:46.677771 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45d62d795db1c823d67e6e5582b82865d36d5ea8aab87289e80ac1ab64eab8cc\": container with ID starting with 45d62d795db1c823d67e6e5582b82865d36d5ea8aab87289e80ac1ab64eab8cc not found: ID does not exist" containerID="45d62d795db1c823d67e6e5582b82865d36d5ea8aab87289e80ac1ab64eab8cc" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.677811 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d62d795db1c823d67e6e5582b82865d36d5ea8aab87289e80ac1ab64eab8cc"} err="failed to get container status \"45d62d795db1c823d67e6e5582b82865d36d5ea8aab87289e80ac1ab64eab8cc\": rpc error: code = NotFound desc = could not find container \"45d62d795db1c823d67e6e5582b82865d36d5ea8aab87289e80ac1ab64eab8cc\": container with ID starting with 45d62d795db1c823d67e6e5582b82865d36d5ea8aab87289e80ac1ab64eab8cc not found: ID does not exist" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.677838 4873 scope.go:117] "RemoveContainer" containerID="372cfba496aa8daf99f042c076f07838550f2060d0c2498dd29a0f4a4051bad1" Feb 19 10:22:46 crc kubenswrapper[4873]: E0219 10:22:46.678193 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"372cfba496aa8daf99f042c076f07838550f2060d0c2498dd29a0f4a4051bad1\": container with ID starting with 
372cfba496aa8daf99f042c076f07838550f2060d0c2498dd29a0f4a4051bad1 not found: ID does not exist" containerID="372cfba496aa8daf99f042c076f07838550f2060d0c2498dd29a0f4a4051bad1" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.678221 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"372cfba496aa8daf99f042c076f07838550f2060d0c2498dd29a0f4a4051bad1"} err="failed to get container status \"372cfba496aa8daf99f042c076f07838550f2060d0c2498dd29a0f4a4051bad1\": rpc error: code = NotFound desc = could not find container \"372cfba496aa8daf99f042c076f07838550f2060d0c2498dd29a0f4a4051bad1\": container with ID starting with 372cfba496aa8daf99f042c076f07838550f2060d0c2498dd29a0f4a4051bad1 not found: ID does not exist" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.678239 4873 scope.go:117] "RemoveContainer" containerID="648394c940eca417606a1205ea47146261640581b2a250bad362a4285f73a1b2" Feb 19 10:22:46 crc kubenswrapper[4873]: E0219 10:22:46.678559 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"648394c940eca417606a1205ea47146261640581b2a250bad362a4285f73a1b2\": container with ID starting with 648394c940eca417606a1205ea47146261640581b2a250bad362a4285f73a1b2 not found: ID does not exist" containerID="648394c940eca417606a1205ea47146261640581b2a250bad362a4285f73a1b2" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.678598 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648394c940eca417606a1205ea47146261640581b2a250bad362a4285f73a1b2"} err="failed to get container status \"648394c940eca417606a1205ea47146261640581b2a250bad362a4285f73a1b2\": rpc error: code = NotFound desc = could not find container \"648394c940eca417606a1205ea47146261640581b2a250bad362a4285f73a1b2\": container with ID starting with 648394c940eca417606a1205ea47146261640581b2a250bad362a4285f73a1b2 not found: ID does not 
exist" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.680944 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687dca98-34c6-47e6-a7cf-89bf448a3426-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.887884 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j4bwz"] Feb 19 10:22:46 crc kubenswrapper[4873]: I0219 10:22:46.935004 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j4bwz"] Feb 19 10:22:47 crc kubenswrapper[4873]: I0219 10:22:47.505186 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="687dca98-34c6-47e6-a7cf-89bf448a3426" path="/var/lib/kubelet/pods/687dca98-34c6-47e6-a7cf-89bf448a3426/volumes" Feb 19 10:22:58 crc kubenswrapper[4873]: I0219 10:22:58.484558 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:22:58 crc kubenswrapper[4873]: E0219 10:22:58.485461 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:23:10 crc kubenswrapper[4873]: I0219 10:23:10.484525 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:23:10 crc kubenswrapper[4873]: E0219 10:23:10.485260 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:23:25 crc kubenswrapper[4873]: I0219 10:23:25.486063 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:23:25 crc kubenswrapper[4873]: E0219 10:23:25.487040 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:23:38 crc kubenswrapper[4873]: I0219 10:23:38.484805 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:23:38 crc kubenswrapper[4873]: E0219 10:23:38.485493 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:23:52 crc kubenswrapper[4873]: I0219 10:23:52.485453 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:23:52 crc kubenswrapper[4873]: E0219 10:23:52.486175 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:24:04 crc kubenswrapper[4873]: I0219 10:24:04.484684 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:24:04 crc kubenswrapper[4873]: E0219 10:24:04.485487 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:24:16 crc kubenswrapper[4873]: I0219 10:24:16.484236 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:24:16 crc kubenswrapper[4873]: E0219 10:24:16.484935 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:24:27 crc kubenswrapper[4873]: I0219 10:24:27.484490 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:24:27 crc kubenswrapper[4873]: E0219 10:24:27.485408 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:24:40 crc kubenswrapper[4873]: I0219 10:24:40.484187 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:24:40 crc kubenswrapper[4873]: E0219 10:24:40.485167 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:24:54 crc kubenswrapper[4873]: I0219 10:24:54.484062 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:24:54 crc kubenswrapper[4873]: E0219 10:24:54.485503 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:25:09 crc kubenswrapper[4873]: I0219 10:25:09.484338 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:25:09 crc kubenswrapper[4873]: E0219 10:25:09.485300 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:25:13 crc kubenswrapper[4873]: I0219 10:25:13.258524 4873 generic.go:334] "Generic (PLEG): container finished" podID="ce5f426d-554a-469a-be1e-e3e1b9bfa68e" containerID="7c7950ee14ffefcf75376610caefc5f50e26f12cde49f2346a88b479e29c5643" exitCode=0 Feb 19 10:25:13 crc kubenswrapper[4873]: I0219 10:25:13.258632 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" event={"ID":"ce5f426d-554a-469a-be1e-e3e1b9bfa68e","Type":"ContainerDied","Data":"7c7950ee14ffefcf75376610caefc5f50e26f12cde49f2346a88b479e29c5643"} Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.688296 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.745574 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-ssh-key-openstack-edpm-ipam\") pod \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.745701 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-extra-config-0\") pod \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.745720 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" 
(UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-0\") pod \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.745796 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-2\") pod \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.745852 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-1\") pod \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.745873 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-inventory\") pod \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.745894 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmvvj\" (UniqueName: \"kubernetes.io/projected/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-kube-api-access-wmvvj\") pod \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.745924 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-migration-ssh-key-0\") pod \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\" (UID: 
\"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.745945 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-migration-ssh-key-1\") pod \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.745971 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-3\") pod \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.745988 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-combined-ca-bundle\") pod \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\" (UID: \"ce5f426d-554a-469a-be1e-e3e1b9bfa68e\") " Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.758565 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-kube-api-access-wmvvj" (OuterVolumeSpecName: "kube-api-access-wmvvj") pod "ce5f426d-554a-469a-be1e-e3e1b9bfa68e" (UID: "ce5f426d-554a-469a-be1e-e3e1b9bfa68e"). InnerVolumeSpecName "kube-api-access-wmvvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.767636 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "ce5f426d-554a-469a-be1e-e3e1b9bfa68e" (UID: "ce5f426d-554a-469a-be1e-e3e1b9bfa68e"). 
InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.771983 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "ce5f426d-554a-469a-be1e-e3e1b9bfa68e" (UID: "ce5f426d-554a-469a-be1e-e3e1b9bfa68e"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.778863 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "ce5f426d-554a-469a-be1e-e3e1b9bfa68e" (UID: "ce5f426d-554a-469a-be1e-e3e1b9bfa68e"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.779490 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "ce5f426d-554a-469a-be1e-e3e1b9bfa68e" (UID: "ce5f426d-554a-469a-be1e-e3e1b9bfa68e"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.780735 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "ce5f426d-554a-469a-be1e-e3e1b9bfa68e" (UID: "ce5f426d-554a-469a-be1e-e3e1b9bfa68e"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.781914 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "ce5f426d-554a-469a-be1e-e3e1b9bfa68e" (UID: "ce5f426d-554a-469a-be1e-e3e1b9bfa68e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.785709 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "ce5f426d-554a-469a-be1e-e3e1b9bfa68e" (UID: "ce5f426d-554a-469a-be1e-e3e1b9bfa68e"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.785959 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ce5f426d-554a-469a-be1e-e3e1b9bfa68e" (UID: "ce5f426d-554a-469a-be1e-e3e1b9bfa68e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.804197 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "ce5f426d-554a-469a-be1e-e3e1b9bfa68e" (UID: "ce5f426d-554a-469a-be1e-e3e1b9bfa68e"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.804243 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-inventory" (OuterVolumeSpecName: "inventory") pod "ce5f426d-554a-469a-be1e-e3e1b9bfa68e" (UID: "ce5f426d-554a-469a-be1e-e3e1b9bfa68e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.848549 4873 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.848591 4873 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.848605 4873 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.848613 4873 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.848626 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.848635 4873 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-wmvvj\" (UniqueName: \"kubernetes.io/projected/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-kube-api-access-wmvvj\") on node \"crc\" DevicePath \"\"" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.848643 4873 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.848651 4873 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.848660 4873 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.848669 4873 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:25:14 crc kubenswrapper[4873]: I0219 10:25:14.848679 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce5f426d-554a-469a-be1e-e3e1b9bfa68e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.279557 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" event={"ID":"ce5f426d-554a-469a-be1e-e3e1b9bfa68e","Type":"ContainerDied","Data":"98cf7115a91b048cf40f5fa0771846df571cc77d9386148ce081fbf2082750b7"} Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.279602 4873 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98cf7115a91b048cf40f5fa0771846df571cc77d9386148ce081fbf2082750b7" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.279609 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v25t6" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.389132 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz"] Feb 19 10:25:15 crc kubenswrapper[4873]: E0219 10:25:15.390034 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687dca98-34c6-47e6-a7cf-89bf448a3426" containerName="extract-utilities" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.390237 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="687dca98-34c6-47e6-a7cf-89bf448a3426" containerName="extract-utilities" Feb 19 10:25:15 crc kubenswrapper[4873]: E0219 10:25:15.390457 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687dca98-34c6-47e6-a7cf-89bf448a3426" containerName="registry-server" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.390601 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="687dca98-34c6-47e6-a7cf-89bf448a3426" containerName="registry-server" Feb 19 10:25:15 crc kubenswrapper[4873]: E0219 10:25:15.390749 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5f426d-554a-469a-be1e-e3e1b9bfa68e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.390879 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5f426d-554a-469a-be1e-e3e1b9bfa68e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 10:25:15 crc kubenswrapper[4873]: E0219 10:25:15.391040 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687dca98-34c6-47e6-a7cf-89bf448a3426" containerName="extract-content" Feb 19 
10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.391221 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="687dca98-34c6-47e6-a7cf-89bf448a3426" containerName="extract-content" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.391778 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="687dca98-34c6-47e6-a7cf-89bf448a3426" containerName="registry-server" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.391951 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce5f426d-554a-469a-be1e-e3e1b9bfa68e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.393342 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.396712 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.397762 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.397772 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.401803 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-5l9s5" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.404343 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.419665 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz"] Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.459236 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.459298 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.459353 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.459384 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.459413 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" 
(UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.459499 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.459556 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99k8b\" (UniqueName: \"kubernetes.io/projected/bf143721-2963-4009-8e23-0c283b4a88a3-kube-api-access-99k8b\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.562761 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.564733 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99k8b\" (UniqueName: \"kubernetes.io/projected/bf143721-2963-4009-8e23-0c283b4a88a3-kube-api-access-99k8b\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.565076 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.565248 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.565431 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.565552 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 
crc kubenswrapper[4873]: I0219 10:25:15.565667 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.575617 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.579092 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.581924 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.585819 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" 
(UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.587703 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.588757 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.596373 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99k8b\" (UniqueName: \"kubernetes.io/projected/bf143721-2963-4009-8e23-0c283b4a88a3-kube-api-access-99k8b\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:15 crc kubenswrapper[4873]: I0219 10:25:15.751793 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:25:16 crc kubenswrapper[4873]: I0219 10:25:16.285495 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz"] Feb 19 10:25:16 crc kubenswrapper[4873]: W0219 10:25:16.290757 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf143721_2963_4009_8e23_0c283b4a88a3.slice/crio-ec7e10e73d578ccea86cceec7a80c2f08166bb6d93976658c29e6c98ef292cac WatchSource:0}: Error finding container ec7e10e73d578ccea86cceec7a80c2f08166bb6d93976658c29e6c98ef292cac: Status 404 returned error can't find the container with id ec7e10e73d578ccea86cceec7a80c2f08166bb6d93976658c29e6c98ef292cac Feb 19 10:25:17 crc kubenswrapper[4873]: I0219 10:25:17.302179 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" event={"ID":"bf143721-2963-4009-8e23-0c283b4a88a3","Type":"ContainerStarted","Data":"b868117e1afa026d3ebaa8096a793cced60c973ae3cbedf3ed41777195b019e2"} Feb 19 10:25:17 crc kubenswrapper[4873]: I0219 10:25:17.302493 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" event={"ID":"bf143721-2963-4009-8e23-0c283b4a88a3","Type":"ContainerStarted","Data":"ec7e10e73d578ccea86cceec7a80c2f08166bb6d93976658c29e6c98ef292cac"} Feb 19 10:25:17 crc kubenswrapper[4873]: I0219 10:25:17.322666 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" podStartSLOduration=1.9381076849999999 podStartE2EDuration="2.322641456s" podCreationTimestamp="2026-02-19 10:25:15 +0000 UTC" firstStartedPulling="2026-02-19 10:25:16.295370413 +0000 UTC m=+2425.584802051" lastFinishedPulling="2026-02-19 10:25:16.679904194 +0000 UTC m=+2425.969335822" 
observedRunningTime="2026-02-19 10:25:17.316995654 +0000 UTC m=+2426.606427302" watchObservedRunningTime="2026-02-19 10:25:17.322641456 +0000 UTC m=+2426.612073104" Feb 19 10:25:24 crc kubenswrapper[4873]: I0219 10:25:24.502330 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:25:24 crc kubenswrapper[4873]: E0219 10:25:24.503221 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:25:37 crc kubenswrapper[4873]: I0219 10:25:37.484790 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:25:37 crc kubenswrapper[4873]: E0219 10:25:37.485620 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:25:48 crc kubenswrapper[4873]: I0219 10:25:48.484532 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:25:48 crc kubenswrapper[4873]: E0219 10:25:48.485338 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:26:01 crc kubenswrapper[4873]: I0219 10:26:01.492822 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:26:01 crc kubenswrapper[4873]: E0219 10:26:01.493698 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:26:09 crc kubenswrapper[4873]: I0219 10:26:09.247584 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bdcwz"] Feb 19 10:26:09 crc kubenswrapper[4873]: I0219 10:26:09.250754 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:09 crc kubenswrapper[4873]: I0219 10:26:09.264503 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bdcwz"] Feb 19 10:26:09 crc kubenswrapper[4873]: I0219 10:26:09.446453 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d27fce7f-0ae7-4e22-885f-ad2a398647cc-utilities\") pod \"certified-operators-bdcwz\" (UID: \"d27fce7f-0ae7-4e22-885f-ad2a398647cc\") " pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:09 crc kubenswrapper[4873]: I0219 10:26:09.446514 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d27fce7f-0ae7-4e22-885f-ad2a398647cc-catalog-content\") pod \"certified-operators-bdcwz\" (UID: \"d27fce7f-0ae7-4e22-885f-ad2a398647cc\") " pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:09 crc kubenswrapper[4873]: I0219 10:26:09.446586 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrmp7\" (UniqueName: \"kubernetes.io/projected/d27fce7f-0ae7-4e22-885f-ad2a398647cc-kube-api-access-vrmp7\") pod \"certified-operators-bdcwz\" (UID: \"d27fce7f-0ae7-4e22-885f-ad2a398647cc\") " pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:09 crc kubenswrapper[4873]: I0219 10:26:09.549643 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d27fce7f-0ae7-4e22-885f-ad2a398647cc-utilities\") pod \"certified-operators-bdcwz\" (UID: \"d27fce7f-0ae7-4e22-885f-ad2a398647cc\") " pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:09 crc kubenswrapper[4873]: I0219 10:26:09.550063 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d27fce7f-0ae7-4e22-885f-ad2a398647cc-catalog-content\") pod \"certified-operators-bdcwz\" (UID: \"d27fce7f-0ae7-4e22-885f-ad2a398647cc\") " pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:09 crc kubenswrapper[4873]: I0219 10:26:09.550223 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrmp7\" (UniqueName: \"kubernetes.io/projected/d27fce7f-0ae7-4e22-885f-ad2a398647cc-kube-api-access-vrmp7\") pod \"certified-operators-bdcwz\" (UID: \"d27fce7f-0ae7-4e22-885f-ad2a398647cc\") " pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:09 crc kubenswrapper[4873]: I0219 10:26:09.550139 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d27fce7f-0ae7-4e22-885f-ad2a398647cc-utilities\") pod \"certified-operators-bdcwz\" (UID: \"d27fce7f-0ae7-4e22-885f-ad2a398647cc\") " pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:09 crc kubenswrapper[4873]: I0219 10:26:09.550500 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d27fce7f-0ae7-4e22-885f-ad2a398647cc-catalog-content\") pod \"certified-operators-bdcwz\" (UID: \"d27fce7f-0ae7-4e22-885f-ad2a398647cc\") " pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:09 crc kubenswrapper[4873]: I0219 10:26:09.571289 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrmp7\" (UniqueName: \"kubernetes.io/projected/d27fce7f-0ae7-4e22-885f-ad2a398647cc-kube-api-access-vrmp7\") pod \"certified-operators-bdcwz\" (UID: \"d27fce7f-0ae7-4e22-885f-ad2a398647cc\") " pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:09 crc kubenswrapper[4873]: I0219 10:26:09.870287 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:10 crc kubenswrapper[4873]: I0219 10:26:10.347262 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bdcwz"] Feb 19 10:26:10 crc kubenswrapper[4873]: I0219 10:26:10.844839 4873 generic.go:334] "Generic (PLEG): container finished" podID="d27fce7f-0ae7-4e22-885f-ad2a398647cc" containerID="f9af54d31ed7feab3f88e8c48aeaa9ce492f22930075ffd283521574bafef2d8" exitCode=0 Feb 19 10:26:10 crc kubenswrapper[4873]: I0219 10:26:10.844880 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdcwz" event={"ID":"d27fce7f-0ae7-4e22-885f-ad2a398647cc","Type":"ContainerDied","Data":"f9af54d31ed7feab3f88e8c48aeaa9ce492f22930075ffd283521574bafef2d8"} Feb 19 10:26:10 crc kubenswrapper[4873]: I0219 10:26:10.844915 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdcwz" event={"ID":"d27fce7f-0ae7-4e22-885f-ad2a398647cc","Type":"ContainerStarted","Data":"aad2e31bce9aa3c21d2169431db6a2f9414a0e6b52e4e906fafee120cecb785e"} Feb 19 10:26:10 crc kubenswrapper[4873]: I0219 10:26:10.846970 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 10:26:14 crc kubenswrapper[4873]: I0219 10:26:14.484546 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:26:14 crc kubenswrapper[4873]: E0219 10:26:14.485548 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 
10:26:15 crc kubenswrapper[4873]: I0219 10:26:15.885881 4873 generic.go:334] "Generic (PLEG): container finished" podID="d27fce7f-0ae7-4e22-885f-ad2a398647cc" containerID="a119938961505d7dd6e47331558f948cbf13d25c3b4a7ac6dc165923f892fccf" exitCode=0 Feb 19 10:26:15 crc kubenswrapper[4873]: I0219 10:26:15.885952 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdcwz" event={"ID":"d27fce7f-0ae7-4e22-885f-ad2a398647cc","Type":"ContainerDied","Data":"a119938961505d7dd6e47331558f948cbf13d25c3b4a7ac6dc165923f892fccf"} Feb 19 10:26:16 crc kubenswrapper[4873]: I0219 10:26:16.900199 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdcwz" event={"ID":"d27fce7f-0ae7-4e22-885f-ad2a398647cc","Type":"ContainerStarted","Data":"9f573e0fb65ef301f9ba3f550f7ba48b9318be65474d4434ed019f4cf2aab52e"} Feb 19 10:26:16 crc kubenswrapper[4873]: I0219 10:26:16.921764 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bdcwz" podStartSLOduration=2.224853929 podStartE2EDuration="7.921745924s" podCreationTimestamp="2026-02-19 10:26:09 +0000 UTC" firstStartedPulling="2026-02-19 10:26:10.846721188 +0000 UTC m=+2480.136152816" lastFinishedPulling="2026-02-19 10:26:16.543613173 +0000 UTC m=+2485.833044811" observedRunningTime="2026-02-19 10:26:16.918610106 +0000 UTC m=+2486.208041754" watchObservedRunningTime="2026-02-19 10:26:16.921745924 +0000 UTC m=+2486.211177562" Feb 19 10:26:19 crc kubenswrapper[4873]: I0219 10:26:19.871123 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:19 crc kubenswrapper[4873]: I0219 10:26:19.871926 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:19 crc kubenswrapper[4873]: I0219 10:26:19.933513 4873 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:28 crc kubenswrapper[4873]: I0219 10:26:28.484303 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:26:28 crc kubenswrapper[4873]: E0219 10:26:28.484861 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:26:29 crc kubenswrapper[4873]: I0219 10:26:29.923660 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bdcwz" Feb 19 10:26:29 crc kubenswrapper[4873]: I0219 10:26:29.992245 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bdcwz"] Feb 19 10:26:30 crc kubenswrapper[4873]: I0219 10:26:30.036023 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c2d4s"] Feb 19 10:26:30 crc kubenswrapper[4873]: I0219 10:26:30.036280 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c2d4s" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" containerName="registry-server" containerID="cri-o://9dcea95636472a4fa99dca5f5648382f340d5a35696bba2dd95d5d4baa05cb5a" gracePeriod=2 Feb 19 10:26:30 crc kubenswrapper[4873]: I0219 10:26:30.546653 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 10:26:30 crc kubenswrapper[4873]: I0219 10:26:30.665687 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8j57\" (UniqueName: \"kubernetes.io/projected/92377803-fb7e-42d1-ba93-54235a8f9409-kube-api-access-g8j57\") pod \"92377803-fb7e-42d1-ba93-54235a8f9409\" (UID: \"92377803-fb7e-42d1-ba93-54235a8f9409\") " Feb 19 10:26:30 crc kubenswrapper[4873]: I0219 10:26:30.666061 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92377803-fb7e-42d1-ba93-54235a8f9409-utilities\") pod \"92377803-fb7e-42d1-ba93-54235a8f9409\" (UID: \"92377803-fb7e-42d1-ba93-54235a8f9409\") " Feb 19 10:26:30 crc kubenswrapper[4873]: I0219 10:26:30.666123 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92377803-fb7e-42d1-ba93-54235a8f9409-catalog-content\") pod \"92377803-fb7e-42d1-ba93-54235a8f9409\" (UID: \"92377803-fb7e-42d1-ba93-54235a8f9409\") " Feb 19 10:26:30 crc kubenswrapper[4873]: I0219 10:26:30.668315 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92377803-fb7e-42d1-ba93-54235a8f9409-utilities" (OuterVolumeSpecName: "utilities") pod "92377803-fb7e-42d1-ba93-54235a8f9409" (UID: "92377803-fb7e-42d1-ba93-54235a8f9409"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:26:30 crc kubenswrapper[4873]: I0219 10:26:30.678302 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92377803-fb7e-42d1-ba93-54235a8f9409-kube-api-access-g8j57" (OuterVolumeSpecName: "kube-api-access-g8j57") pod "92377803-fb7e-42d1-ba93-54235a8f9409" (UID: "92377803-fb7e-42d1-ba93-54235a8f9409"). InnerVolumeSpecName "kube-api-access-g8j57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:26:30 crc kubenswrapper[4873]: I0219 10:26:30.768492 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8j57\" (UniqueName: \"kubernetes.io/projected/92377803-fb7e-42d1-ba93-54235a8f9409-kube-api-access-g8j57\") on node \"crc\" DevicePath \"\"" Feb 19 10:26:30 crc kubenswrapper[4873]: I0219 10:26:30.768524 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92377803-fb7e-42d1-ba93-54235a8f9409-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 10:26:30 crc kubenswrapper[4873]: I0219 10:26:30.774406 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92377803-fb7e-42d1-ba93-54235a8f9409-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92377803-fb7e-42d1-ba93-54235a8f9409" (UID: "92377803-fb7e-42d1-ba93-54235a8f9409"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:26:30 crc kubenswrapper[4873]: I0219 10:26:30.870048 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92377803-fb7e-42d1-ba93-54235a8f9409-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.034924 4873 generic.go:334] "Generic (PLEG): container finished" podID="92377803-fb7e-42d1-ba93-54235a8f9409" containerID="9dcea95636472a4fa99dca5f5648382f340d5a35696bba2dd95d5d4baa05cb5a" exitCode=0 Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.034968 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2d4s" event={"ID":"92377803-fb7e-42d1-ba93-54235a8f9409","Type":"ContainerDied","Data":"9dcea95636472a4fa99dca5f5648382f340d5a35696bba2dd95d5d4baa05cb5a"} Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.034995 4873 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-c2d4s" event={"ID":"92377803-fb7e-42d1-ba93-54235a8f9409","Type":"ContainerDied","Data":"a08dd4d7e39597a08d46eee691df8d3e8119bb68e610a407ed93ede91eb7581e"} Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.035015 4873 scope.go:117] "RemoveContainer" containerID="9dcea95636472a4fa99dca5f5648382f340d5a35696bba2dd95d5d4baa05cb5a" Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.035195 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2d4s" Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.067745 4873 scope.go:117] "RemoveContainer" containerID="5485a42ca3dc7f78b9a621c9cc2138b955a8e9f0ec19cb28fe151e0c8ae3a5b2" Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.080401 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c2d4s"] Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.101720 4873 scope.go:117] "RemoveContainer" containerID="09d79d6a44f7ab2bd840bebd67e2c4ff2d2bdef097d6feda8d881b4512fee593" Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.103497 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c2d4s"] Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.153553 4873 scope.go:117] "RemoveContainer" containerID="9dcea95636472a4fa99dca5f5648382f340d5a35696bba2dd95d5d4baa05cb5a" Feb 19 10:26:31 crc kubenswrapper[4873]: E0219 10:26:31.155443 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dcea95636472a4fa99dca5f5648382f340d5a35696bba2dd95d5d4baa05cb5a\": container with ID starting with 9dcea95636472a4fa99dca5f5648382f340d5a35696bba2dd95d5d4baa05cb5a not found: ID does not exist" containerID="9dcea95636472a4fa99dca5f5648382f340d5a35696bba2dd95d5d4baa05cb5a" Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 
10:26:31.155590 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dcea95636472a4fa99dca5f5648382f340d5a35696bba2dd95d5d4baa05cb5a"} err="failed to get container status \"9dcea95636472a4fa99dca5f5648382f340d5a35696bba2dd95d5d4baa05cb5a\": rpc error: code = NotFound desc = could not find container \"9dcea95636472a4fa99dca5f5648382f340d5a35696bba2dd95d5d4baa05cb5a\": container with ID starting with 9dcea95636472a4fa99dca5f5648382f340d5a35696bba2dd95d5d4baa05cb5a not found: ID does not exist" Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.155673 4873 scope.go:117] "RemoveContainer" containerID="5485a42ca3dc7f78b9a621c9cc2138b955a8e9f0ec19cb28fe151e0c8ae3a5b2" Feb 19 10:26:31 crc kubenswrapper[4873]: E0219 10:26:31.156136 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5485a42ca3dc7f78b9a621c9cc2138b955a8e9f0ec19cb28fe151e0c8ae3a5b2\": container with ID starting with 5485a42ca3dc7f78b9a621c9cc2138b955a8e9f0ec19cb28fe151e0c8ae3a5b2 not found: ID does not exist" containerID="5485a42ca3dc7f78b9a621c9cc2138b955a8e9f0ec19cb28fe151e0c8ae3a5b2" Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.156177 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5485a42ca3dc7f78b9a621c9cc2138b955a8e9f0ec19cb28fe151e0c8ae3a5b2"} err="failed to get container status \"5485a42ca3dc7f78b9a621c9cc2138b955a8e9f0ec19cb28fe151e0c8ae3a5b2\": rpc error: code = NotFound desc = could not find container \"5485a42ca3dc7f78b9a621c9cc2138b955a8e9f0ec19cb28fe151e0c8ae3a5b2\": container with ID starting with 5485a42ca3dc7f78b9a621c9cc2138b955a8e9f0ec19cb28fe151e0c8ae3a5b2 not found: ID does not exist" Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.156203 4873 scope.go:117] "RemoveContainer" containerID="09d79d6a44f7ab2bd840bebd67e2c4ff2d2bdef097d6feda8d881b4512fee593" Feb 19 10:26:31 crc 
kubenswrapper[4873]: E0219 10:26:31.156526 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09d79d6a44f7ab2bd840bebd67e2c4ff2d2bdef097d6feda8d881b4512fee593\": container with ID starting with 09d79d6a44f7ab2bd840bebd67e2c4ff2d2bdef097d6feda8d881b4512fee593 not found: ID does not exist" containerID="09d79d6a44f7ab2bd840bebd67e2c4ff2d2bdef097d6feda8d881b4512fee593" Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.156624 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09d79d6a44f7ab2bd840bebd67e2c4ff2d2bdef097d6feda8d881b4512fee593"} err="failed to get container status \"09d79d6a44f7ab2bd840bebd67e2c4ff2d2bdef097d6feda8d881b4512fee593\": rpc error: code = NotFound desc = could not find container \"09d79d6a44f7ab2bd840bebd67e2c4ff2d2bdef097d6feda8d881b4512fee593\": container with ID starting with 09d79d6a44f7ab2bd840bebd67e2c4ff2d2bdef097d6feda8d881b4512fee593 not found: ID does not exist" Feb 19 10:26:31 crc kubenswrapper[4873]: I0219 10:26:31.497233 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" path="/var/lib/kubelet/pods/92377803-fb7e-42d1-ba93-54235a8f9409/volumes" Feb 19 10:26:43 crc kubenswrapper[4873]: I0219 10:26:43.487311 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:26:43 crc kubenswrapper[4873]: E0219 10:26:43.488475 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:26:55 crc 
kubenswrapper[4873]: I0219 10:26:55.484239 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806" Feb 19 10:26:56 crc kubenswrapper[4873]: I0219 10:26:56.302243 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"edec24981d97e1beda63a2d9013b9abdb1a1dbeed2c76ab65161659d51d3be20"} Feb 19 10:27:19 crc kubenswrapper[4873]: I0219 10:27:19.517338 4873 generic.go:334] "Generic (PLEG): container finished" podID="bf143721-2963-4009-8e23-0c283b4a88a3" containerID="b868117e1afa026d3ebaa8096a793cced60c973ae3cbedf3ed41777195b019e2" exitCode=0 Feb 19 10:27:19 crc kubenswrapper[4873]: I0219 10:27:19.517426 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" event={"ID":"bf143721-2963-4009-8e23-0c283b4a88a3","Type":"ContainerDied","Data":"b868117e1afa026d3ebaa8096a793cced60c973ae3cbedf3ed41777195b019e2"} Feb 19 10:27:20 crc kubenswrapper[4873]: I0219 10:27:20.973181 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:27:20 crc kubenswrapper[4873]: I0219 10:27:20.976812 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-inventory\") pod \"bf143721-2963-4009-8e23-0c283b4a88a3\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " Feb 19 10:27:20 crc kubenswrapper[4873]: I0219 10:27:20.976895 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-2\") pod \"bf143721-2963-4009-8e23-0c283b4a88a3\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " Feb 19 10:27:20 crc kubenswrapper[4873]: I0219 10:27:20.976935 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-0\") pod \"bf143721-2963-4009-8e23-0c283b4a88a3\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " Feb 19 10:27:20 crc kubenswrapper[4873]: I0219 10:27:20.977035 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-telemetry-combined-ca-bundle\") pod \"bf143721-2963-4009-8e23-0c283b4a88a3\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " Feb 19 10:27:20 crc kubenswrapper[4873]: I0219 10:27:20.977076 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ssh-key-openstack-edpm-ipam\") pod \"bf143721-2963-4009-8e23-0c283b4a88a3\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " Feb 19 10:27:20 crc 
kubenswrapper[4873]: I0219 10:27:20.977184 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99k8b\" (UniqueName: \"kubernetes.io/projected/bf143721-2963-4009-8e23-0c283b4a88a3-kube-api-access-99k8b\") pod \"bf143721-2963-4009-8e23-0c283b4a88a3\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " Feb 19 10:27:20 crc kubenswrapper[4873]: I0219 10:27:20.977288 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-1\") pod \"bf143721-2963-4009-8e23-0c283b4a88a3\" (UID: \"bf143721-2963-4009-8e23-0c283b4a88a3\") " Feb 19 10:27:20 crc kubenswrapper[4873]: I0219 10:27:20.984270 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf143721-2963-4009-8e23-0c283b4a88a3-kube-api-access-99k8b" (OuterVolumeSpecName: "kube-api-access-99k8b") pod "bf143721-2963-4009-8e23-0c283b4a88a3" (UID: "bf143721-2963-4009-8e23-0c283b4a88a3"). InnerVolumeSpecName "kube-api-access-99k8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:27:20 crc kubenswrapper[4873]: I0219 10:27:20.984352 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "bf143721-2963-4009-8e23-0c283b4a88a3" (UID: "bf143721-2963-4009-8e23-0c283b4a88a3"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 10:27:21.019907 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "bf143721-2963-4009-8e23-0c283b4a88a3" (UID: "bf143721-2963-4009-8e23-0c283b4a88a3"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 10:27:21.023432 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-inventory" (OuterVolumeSpecName: "inventory") pod "bf143721-2963-4009-8e23-0c283b4a88a3" (UID: "bf143721-2963-4009-8e23-0c283b4a88a3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 10:27:21.027325 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "bf143721-2963-4009-8e23-0c283b4a88a3" (UID: "bf143721-2963-4009-8e23-0c283b4a88a3"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 10:27:21.032713 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "bf143721-2963-4009-8e23-0c283b4a88a3" (UID: "bf143721-2963-4009-8e23-0c283b4a88a3"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 10:27:21.033276 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bf143721-2963-4009-8e23-0c283b4a88a3" (UID: "bf143721-2963-4009-8e23-0c283b4a88a3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 10:27:21.079164 4873 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 10:27:21.079282 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 10:27:21.079454 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99k8b\" (UniqueName: \"kubernetes.io/projected/bf143721-2963-4009-8e23-0c283b4a88a3-kube-api-access-99k8b\") on node \"crc\" DevicePath \"\"" Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 10:27:21.079516 4873 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 10:27:21.079591 4873 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 
10:27:21.079648 4873 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 10:27:21.079728 4873 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bf143721-2963-4009-8e23-0c283b4a88a3-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 10:27:21.538073 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" event={"ID":"bf143721-2963-4009-8e23-0c283b4a88a3","Type":"ContainerDied","Data":"ec7e10e73d578ccea86cceec7a80c2f08166bb6d93976658c29e6c98ef292cac"} Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 10:27:21.538126 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec7e10e73d578ccea86cceec7a80c2f08166bb6d93976658c29e6c98ef292cac" Feb 19 10:27:21 crc kubenswrapper[4873]: I0219 10:27:21.538161 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.880315 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Feb 19 10:27:53 crc kubenswrapper[4873]: E0219 10:27:53.881416 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf143721-2963-4009-8e23-0c283b4a88a3" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.881439 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf143721-2963-4009-8e23-0c283b4a88a3" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 10:27:53 crc kubenswrapper[4873]: E0219 10:27:53.881469 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" containerName="extract-utilities" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.881478 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" containerName="extract-utilities" Feb 19 10:27:53 crc kubenswrapper[4873]: E0219 10:27:53.881509 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" containerName="registry-server" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.881517 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" containerName="registry-server" Feb 19 10:27:53 crc kubenswrapper[4873]: E0219 10:27:53.881547 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" containerName="extract-content" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.881555 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" containerName="extract-content" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.881812 4873 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="bf143721-2963-4009-8e23-0c283b4a88a3" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.881836 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="92377803-fb7e-42d1-ba93-54235a8f9409" containerName="registry-server" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.887508 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.889957 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.899294 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.960001 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.961931 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.964951 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.984132 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.997839 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312e766d-4086-4bab-bf8f-9a154f1da5b5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.997893 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312e766d-4086-4bab-bf8f-9a154f1da5b5-scripts\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.997925 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.997950 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.997972 4873 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-sys\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.997986 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/312e766d-4086-4bab-bf8f-9a154f1da5b5-config-data-custom\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998005 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-lib-modules\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998024 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998044 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717b3122-e7c6-4cbe-8528-4b582dd7adc5-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998064 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dev\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-dev\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998133 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-run\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998149 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998161 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-run\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998190 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-dev\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998209 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/717b3122-e7c6-4cbe-8528-4b582dd7adc5-scripts\") pod \"cinder-volume-nfs-0\" (UID: 
\"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998229 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-etc-nvme\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998247 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998270 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-sys\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998287 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998312 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc 
kubenswrapper[4873]: I0219 10:27:53.998340 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312e766d-4086-4bab-bf8f-9a154f1da5b5-config-data\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998355 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998378 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998392 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998419 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72k9m\" (UniqueName: \"kubernetes.io/projected/312e766d-4086-4bab-bf8f-9a154f1da5b5-kube-api-access-72k9m\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998435 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998456 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/717b3122-e7c6-4cbe-8528-4b582dd7adc5-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998471 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hxjg\" (UniqueName: \"kubernetes.io/projected/717b3122-e7c6-4cbe-8528-4b582dd7adc5-kube-api-access-4hxjg\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998485 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/717b3122-e7c6-4cbe-8528-4b582dd7adc5-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:53 crc kubenswrapper[4873]: I0219 10:27:53.998508 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.035618 4873 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-volume-nfs-2-0"] Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.037357 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.039485 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.050629 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100502 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100581 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312e766d-4086-4bab-bf8f-9a154f1da5b5-config-data\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100639 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgb6v\" (UniqueName: \"kubernetes.io/projected/8268173a-e7be-4edd-a1e8-bed3486b138e-kube-api-access-jgb6v\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100664 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-dev\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100702 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100751 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100776 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100812 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100822 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72k9m\" (UniqueName: \"kubernetes.io/projected/312e766d-4086-4bab-bf8f-9a154f1da5b5-kube-api-access-72k9m\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100847 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100875 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100928 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/717b3122-e7c6-4cbe-8528-4b582dd7adc5-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100958 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hxjg\" (UniqueName: \"kubernetes.io/projected/717b3122-e7c6-4cbe-8528-4b582dd7adc5-kube-api-access-4hxjg\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100975 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/717b3122-e7c6-4cbe-8528-4b582dd7adc5-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.100994 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101052 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101072 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101087 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8268173a-e7be-4edd-a1e8-bed3486b138e-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101145 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312e766d-4086-4bab-bf8f-9a154f1da5b5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101201 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312e766d-4086-4bab-bf8f-9a154f1da5b5-scripts\") pod \"cinder-backup-0\" 
(UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101243 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101272 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101294 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8268173a-e7be-4edd-a1e8-bed3486b138e-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101325 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101356 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-sys\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 
10:27:54.101375 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8268173a-e7be-4edd-a1e8-bed3486b138e-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101394 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/312e766d-4086-4bab-bf8f-9a154f1da5b5-config-data-custom\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101426 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-lib-modules\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101446 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101471 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101485 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101512 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717b3122-e7c6-4cbe-8528-4b582dd7adc5-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101543 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-dev\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101589 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-run\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101600 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101605 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 
19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101648 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-run\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101719 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101727 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101758 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-run\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101778 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-dev\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101795 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-sys\") pod 
\"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101797 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/717b3122-e7c6-4cbe-8528-4b582dd7adc5-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101843 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101878 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-etc-nvme\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101914 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8268173a-e7be-4edd-a1e8-bed3486b138e-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.101939 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 
10:27:54.102015 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-sys\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.102058 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.102147 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.102175 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.102274 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.102389 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.102762 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-run\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.102830 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-sys\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.102936 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-dev\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.103043 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.103168 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.103278 4873 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.103409 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.103533 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.103638 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-lib-modules\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.103822 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-dev\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.103986 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/312e766d-4086-4bab-bf8f-9a154f1da5b5-etc-nvme\") pod \"cinder-backup-0\" (UID: 
\"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.104046 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/717b3122-e7c6-4cbe-8528-4b582dd7adc5-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.107961 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/312e766d-4086-4bab-bf8f-9a154f1da5b5-config-data-custom\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.108379 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/717b3122-e7c6-4cbe-8528-4b582dd7adc5-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.110912 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312e766d-4086-4bab-bf8f-9a154f1da5b5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.111935 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312e766d-4086-4bab-bf8f-9a154f1da5b5-config-data\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.113323 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/312e766d-4086-4bab-bf8f-9a154f1da5b5-scripts\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.113593 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/717b3122-e7c6-4cbe-8528-4b582dd7adc5-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.114034 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717b3122-e7c6-4cbe-8528-4b582dd7adc5-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.118903 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/717b3122-e7c6-4cbe-8528-4b582dd7adc5-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.119651 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72k9m\" (UniqueName: \"kubernetes.io/projected/312e766d-4086-4bab-bf8f-9a154f1da5b5-kube-api-access-72k9m\") pod \"cinder-backup-0\" (UID: \"312e766d-4086-4bab-bf8f-9a154f1da5b5\") " pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.119756 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hxjg\" (UniqueName: \"kubernetes.io/projected/717b3122-e7c6-4cbe-8528-4b582dd7adc5-kube-api-access-4hxjg\") pod \"cinder-volume-nfs-0\" (UID: \"717b3122-e7c6-4cbe-8528-4b582dd7adc5\") " 
pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205463 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205521 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205536 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8268173a-e7be-4edd-a1e8-bed3486b138e-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205581 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205604 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8268173a-e7be-4edd-a1e8-bed3486b138e-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205630 4873 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8268173a-e7be-4edd-a1e8-bed3486b138e-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205646 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205665 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205709 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205732 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205751 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8268173a-e7be-4edd-a1e8-bed3486b138e-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205787 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205810 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205876 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgb6v\" (UniqueName: \"kubernetes.io/projected/8268173a-e7be-4edd-a1e8-bed3486b138e-kube-api-access-jgb6v\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205895 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.205984 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " 
pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.206038 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.206090 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.206579 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.206613 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.206614 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.206634 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.206695 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.206924 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.207006 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8268173a-e7be-4edd-a1e8-bed3486b138e-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.213222 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8268173a-e7be-4edd-a1e8-bed3486b138e-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.213901 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8268173a-e7be-4edd-a1e8-bed3486b138e-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" 
Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.213963 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8268173a-e7be-4edd-a1e8-bed3486b138e-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.214574 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8268173a-e7be-4edd-a1e8-bed3486b138e-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.228300 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgb6v\" (UniqueName: \"kubernetes.io/projected/8268173a-e7be-4edd-a1e8-bed3486b138e-kube-api-access-jgb6v\") pod \"cinder-volume-nfs-2-0\" (UID: \"8268173a-e7be-4edd-a1e8-bed3486b138e\") " pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.241278 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.277899 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.357989 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.897463 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 19 10:27:54 crc kubenswrapper[4873]: I0219 10:27:54.910679 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"312e766d-4086-4bab-bf8f-9a154f1da5b5","Type":"ContainerStarted","Data":"961f4c4eddf24a0513e2fe9b761a0202865e1cc843dc3d1c04366638dd0088ae"} Feb 19 10:27:55 crc kubenswrapper[4873]: I0219 10:27:55.019344 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Feb 19 10:27:55 crc kubenswrapper[4873]: I0219 10:27:55.102917 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Feb 19 10:27:55 crc kubenswrapper[4873]: W0219 10:27:55.194253 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8268173a_e7be_4edd_a1e8_bed3486b138e.slice/crio-c03e5045e9e471dde8e7d32d98b14fbc330b1ed6fba2f3d47609cc01cbd1b15a WatchSource:0}: Error finding container c03e5045e9e471dde8e7d32d98b14fbc330b1ed6fba2f3d47609cc01cbd1b15a: Status 404 returned error can't find the container with id c03e5045e9e471dde8e7d32d98b14fbc330b1ed6fba2f3d47609cc01cbd1b15a Feb 19 10:27:55 crc kubenswrapper[4873]: I0219 10:27:55.923955 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"312e766d-4086-4bab-bf8f-9a154f1da5b5","Type":"ContainerStarted","Data":"ecbd2535a2e827e730f4f5e040504890874fa719312f4961e2a232ae0dd6038a"} Feb 19 10:27:55 crc kubenswrapper[4873]: I0219 10:27:55.926301 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"717b3122-e7c6-4cbe-8528-4b582dd7adc5","Type":"ContainerStarted","Data":"c3f784c8d49cdf2e2dcbbe10ee20f040fc0f6e1ec216e5a6ed15a35dec145ff5"} Feb 19 10:27:55 crc 
kubenswrapper[4873]: I0219 10:27:55.926345 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"717b3122-e7c6-4cbe-8528-4b582dd7adc5","Type":"ContainerStarted","Data":"355767f3ecdcb659b330cd062c004359cba04a7d72d50415f1ac2f4cadee3afe"} Feb 19 10:27:55 crc kubenswrapper[4873]: I0219 10:27:55.927165 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"8268173a-e7be-4edd-a1e8-bed3486b138e","Type":"ContainerStarted","Data":"c03e5045e9e471dde8e7d32d98b14fbc330b1ed6fba2f3d47609cc01cbd1b15a"} Feb 19 10:27:56 crc kubenswrapper[4873]: I0219 10:27:56.953853 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"312e766d-4086-4bab-bf8f-9a154f1da5b5","Type":"ContainerStarted","Data":"c63855b80845073b98d0858b03ab13806fae9cae4d9a0b9fc97839acc005f5d7"} Feb 19 10:27:56 crc kubenswrapper[4873]: I0219 10:27:56.969400 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"717b3122-e7c6-4cbe-8528-4b582dd7adc5","Type":"ContainerStarted","Data":"4fbd9a9615b0818985931b63b4655fecaed536d5212f0144603bba625756797f"} Feb 19 10:27:56 crc kubenswrapper[4873]: I0219 10:27:56.972639 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"8268173a-e7be-4edd-a1e8-bed3486b138e","Type":"ContainerStarted","Data":"e0a9ab4d817ce1eeddae293a19f306829951931358e2b315242de292c3e25eba"} Feb 19 10:27:56 crc kubenswrapper[4873]: I0219 10:27:56.972678 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"8268173a-e7be-4edd-a1e8-bed3486b138e","Type":"ContainerStarted","Data":"e99a7348a838c4370c5513b26f7f2df0eae85f119ca65f1ebddf578dd3416115"} Feb 19 10:27:56 crc kubenswrapper[4873]: I0219 10:27:56.985900 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" 
podStartSLOduration=3.576394156 podStartE2EDuration="3.985882955s" podCreationTimestamp="2026-02-19 10:27:53 +0000 UTC" firstStartedPulling="2026-02-19 10:27:54.905867178 +0000 UTC m=+2584.195298816" lastFinishedPulling="2026-02-19 10:27:55.315355977 +0000 UTC m=+2584.604787615" observedRunningTime="2026-02-19 10:27:56.981757132 +0000 UTC m=+2586.271188780" watchObservedRunningTime="2026-02-19 10:27:56.985882955 +0000 UTC m=+2586.275314593" Feb 19 10:27:57 crc kubenswrapper[4873]: I0219 10:27:57.042859 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=2.588840036 podStartE2EDuration="3.042839057s" podCreationTimestamp="2026-02-19 10:27:54 +0000 UTC" firstStartedPulling="2026-02-19 10:27:55.234565119 +0000 UTC m=+2584.523996757" lastFinishedPulling="2026-02-19 10:27:55.68856414 +0000 UTC m=+2584.977995778" observedRunningTime="2026-02-19 10:27:57.032613412 +0000 UTC m=+2586.322045050" watchObservedRunningTime="2026-02-19 10:27:57.042839057 +0000 UTC m=+2586.332270695" Feb 19 10:27:57 crc kubenswrapper[4873]: I0219 10:27:57.045446 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=3.540261974 podStartE2EDuration="4.045435502s" podCreationTimestamp="2026-02-19 10:27:53 +0000 UTC" firstStartedPulling="2026-02-19 10:27:55.178973331 +0000 UTC m=+2584.468404969" lastFinishedPulling="2026-02-19 10:27:55.684146859 +0000 UTC m=+2584.973578497" observedRunningTime="2026-02-19 10:27:57.00890498 +0000 UTC m=+2586.298336618" watchObservedRunningTime="2026-02-19 10:27:57.045435502 +0000 UTC m=+2586.334867140" Feb 19 10:27:59 crc kubenswrapper[4873]: I0219 10:27:59.241653 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 19 10:27:59 crc kubenswrapper[4873]: I0219 10:27:59.278700 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/cinder-volume-nfs-0" Feb 19 10:27:59 crc kubenswrapper[4873]: I0219 10:27:59.358913 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:28:04 crc kubenswrapper[4873]: I0219 10:28:04.490717 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0" Feb 19 10:28:04 crc kubenswrapper[4873]: I0219 10:28:04.571079 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0" Feb 19 10:28:04 crc kubenswrapper[4873]: I0219 10:28:04.620661 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Feb 19 10:29:03 crc kubenswrapper[4873]: I0219 10:29:03.331848 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 10:29:03 crc kubenswrapper[4873]: I0219 10:29:03.332753 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerName="prometheus" containerID="cri-o://527fb1a245b8b03bec5d82faa5a774b17399f65786d8762162ef556987d4ff0f" gracePeriod=600 Feb 19 10:29:03 crc kubenswrapper[4873]: I0219 10:29:03.333406 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerName="thanos-sidecar" containerID="cri-o://e807a1b081c19c6059b7e493eef6d50be38743a8a7b0942d44ce78b41a711573" gracePeriod=600 Feb 19 10:29:03 crc kubenswrapper[4873]: I0219 10:29:03.333466 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerName="config-reloader" containerID="cri-o://92436e82b0191e1ee2ed056fcf87daa473f88f898d89c02a039d222108f86980" gracePeriod=600 Feb 19 10:29:03 crc 
kubenswrapper[4873]: I0219 10:29:03.607123 4873 generic.go:334] "Generic (PLEG): container finished" podID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerID="e807a1b081c19c6059b7e493eef6d50be38743a8a7b0942d44ce78b41a711573" exitCode=0 Feb 19 10:29:03 crc kubenswrapper[4873]: I0219 10:29:03.607628 4873 generic.go:334] "Generic (PLEG): container finished" podID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerID="527fb1a245b8b03bec5d82faa5a774b17399f65786d8762162ef556987d4ff0f" exitCode=0 Feb 19 10:29:03 crc kubenswrapper[4873]: I0219 10:29:03.607201 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d1070e0c-7518-4d1b-bbb8-e56db1cad28a","Type":"ContainerDied","Data":"e807a1b081c19c6059b7e493eef6d50be38743a8a7b0942d44ce78b41a711573"} Feb 19 10:29:03 crc kubenswrapper[4873]: I0219 10:29:03.607832 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d1070e0c-7518-4d1b-bbb8-e56db1cad28a","Type":"ContainerDied","Data":"527fb1a245b8b03bec5d82faa5a774b17399f65786d8762162ef556987d4ff0f"} Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.381213 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.471715 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-tls-assets\") pod \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.471819 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-thanos-prometheus-http-client-file\") pod \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.471850 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-secret-combined-ca-bundle\") pod \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.471898 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snd6v\" (UniqueName: \"kubernetes.io/projected/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-kube-api-access-snd6v\") pod \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.471944 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-2\") pod \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 
10:29:04.471970 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-0\") pod \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.471996 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.472261 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") pod \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.472350 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.472376 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-1\") pod \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\" (UID: 
\"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.472438 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-config\") pod \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.472493 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config\") pod \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.472519 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-config-out\") pod \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\" (UID: \"d1070e0c-7518-4d1b-bbb8-e56db1cad28a\") " Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.476082 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "d1070e0c-7518-4d1b-bbb8-e56db1cad28a" (UID: "d1070e0c-7518-4d1b-bbb8-e56db1cad28a"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.484020 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "d1070e0c-7518-4d1b-bbb8-e56db1cad28a" (UID: "d1070e0c-7518-4d1b-bbb8-e56db1cad28a"). 
InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.485019 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-config-out" (OuterVolumeSpecName: "config-out") pod "d1070e0c-7518-4d1b-bbb8-e56db1cad28a" (UID: "d1070e0c-7518-4d1b-bbb8-e56db1cad28a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.485154 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "d1070e0c-7518-4d1b-bbb8-e56db1cad28a" (UID: "d1070e0c-7518-4d1b-bbb8-e56db1cad28a"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.493382 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "d1070e0c-7518-4d1b-bbb8-e56db1cad28a" (UID: "d1070e0c-7518-4d1b-bbb8-e56db1cad28a"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.496088 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-kube-api-access-snd6v" (OuterVolumeSpecName: "kube-api-access-snd6v") pod "d1070e0c-7518-4d1b-bbb8-e56db1cad28a" (UID: "d1070e0c-7518-4d1b-bbb8-e56db1cad28a"). InnerVolumeSpecName "kube-api-access-snd6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.497498 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-config" (OuterVolumeSpecName: "config") pod "d1070e0c-7518-4d1b-bbb8-e56db1cad28a" (UID: "d1070e0c-7518-4d1b-bbb8-e56db1cad28a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.497600 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "d1070e0c-7518-4d1b-bbb8-e56db1cad28a" (UID: "d1070e0c-7518-4d1b-bbb8-e56db1cad28a"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.508222 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "d1070e0c-7518-4d1b-bbb8-e56db1cad28a" (UID: "d1070e0c-7518-4d1b-bbb8-e56db1cad28a"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.509299 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d1070e0c-7518-4d1b-bbb8-e56db1cad28a" (UID: "d1070e0c-7518-4d1b-bbb8-e56db1cad28a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.536306 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "d1070e0c-7518-4d1b-bbb8-e56db1cad28a" (UID: "d1070e0c-7518-4d1b-bbb8-e56db1cad28a"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.636086 4873 generic.go:334] "Generic (PLEG): container finished" podID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerID="92436e82b0191e1ee2ed056fcf87daa473f88f898d89c02a039d222108f86980" exitCode=0 Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.636157 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d1070e0c-7518-4d1b-bbb8-e56db1cad28a","Type":"ContainerDied","Data":"92436e82b0191e1ee2ed056fcf87daa473f88f898d89c02a039d222108f86980"} Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.636192 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d1070e0c-7518-4d1b-bbb8-e56db1cad28a","Type":"ContainerDied","Data":"67a74c25d6b44ccd6cb397b300a6cd2025bf7fa88890d389ef81197cfb4ef22d"} Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.636228 4873 scope.go:117] "RemoveContainer" containerID="e807a1b081c19c6059b7e493eef6d50be38743a8a7b0942d44ce78b41a711573" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.636459 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.637407 4873 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.640716 4873 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.640802 4873 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.640817 4873 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.640826 4873 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-config-out\") on node \"crc\" DevicePath \"\"" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.640834 4873 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.640848 4873 reconciler_common.go:293] "Volume detached for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.640866 4873 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.640876 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snd6v\" (UniqueName: \"kubernetes.io/projected/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-kube-api-access-snd6v\") on node \"crc\" DevicePath \"\"" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.640895 4873 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.640909 4873 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.654008 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "d1070e0c-7518-4d1b-bbb8-e56db1cad28a" (UID: "d1070e0c-7518-4d1b-bbb8-e56db1cad28a"). InnerVolumeSpecName "pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.712885 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config" (OuterVolumeSpecName: "web-config") pod "d1070e0c-7518-4d1b-bbb8-e56db1cad28a" (UID: "d1070e0c-7518-4d1b-bbb8-e56db1cad28a"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.743256 4873 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") on node \"crc\" " Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.743299 4873 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d1070e0c-7518-4d1b-bbb8-e56db1cad28a-web-config\") on node \"crc\" DevicePath \"\"" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.766676 4873 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.767575 4873 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d") on node "crc" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.821456 4873 scope.go:117] "RemoveContainer" containerID="92436e82b0191e1ee2ed056fcf87daa473f88f898d89c02a039d222108f86980" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.843970 4873 scope.go:117] "RemoveContainer" containerID="527fb1a245b8b03bec5d82faa5a774b17399f65786d8762162ef556987d4ff0f" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.845782 4873 reconciler_common.go:293] "Volume detached for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") on node \"crc\" DevicePath \"\"" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.862160 4873 scope.go:117] "RemoveContainer" containerID="5dba258377743a5715353e87db5d11c9c4d59b17ccf94ca70c8d425eaaad3a52" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.907847 4873 scope.go:117] "RemoveContainer" containerID="e807a1b081c19c6059b7e493eef6d50be38743a8a7b0942d44ce78b41a711573" Feb 19 10:29:04 crc kubenswrapper[4873]: E0219 10:29:04.908347 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e807a1b081c19c6059b7e493eef6d50be38743a8a7b0942d44ce78b41a711573\": container with ID starting with e807a1b081c19c6059b7e493eef6d50be38743a8a7b0942d44ce78b41a711573 not found: ID does not exist" containerID="e807a1b081c19c6059b7e493eef6d50be38743a8a7b0942d44ce78b41a711573" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.908383 4873 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e807a1b081c19c6059b7e493eef6d50be38743a8a7b0942d44ce78b41a711573"} err="failed to get container status \"e807a1b081c19c6059b7e493eef6d50be38743a8a7b0942d44ce78b41a711573\": rpc error: code = NotFound desc = could not find container \"e807a1b081c19c6059b7e493eef6d50be38743a8a7b0942d44ce78b41a711573\": container with ID starting with e807a1b081c19c6059b7e493eef6d50be38743a8a7b0942d44ce78b41a711573 not found: ID does not exist" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.908411 4873 scope.go:117] "RemoveContainer" containerID="92436e82b0191e1ee2ed056fcf87daa473f88f898d89c02a039d222108f86980" Feb 19 10:29:04 crc kubenswrapper[4873]: E0219 10:29:04.908645 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92436e82b0191e1ee2ed056fcf87daa473f88f898d89c02a039d222108f86980\": container with ID starting with 92436e82b0191e1ee2ed056fcf87daa473f88f898d89c02a039d222108f86980 not found: ID does not exist" containerID="92436e82b0191e1ee2ed056fcf87daa473f88f898d89c02a039d222108f86980" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.908677 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92436e82b0191e1ee2ed056fcf87daa473f88f898d89c02a039d222108f86980"} err="failed to get container status \"92436e82b0191e1ee2ed056fcf87daa473f88f898d89c02a039d222108f86980\": rpc error: code = NotFound desc = could not find container \"92436e82b0191e1ee2ed056fcf87daa473f88f898d89c02a039d222108f86980\": container with ID starting with 92436e82b0191e1ee2ed056fcf87daa473f88f898d89c02a039d222108f86980 not found: ID does not exist" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.908695 4873 scope.go:117] "RemoveContainer" containerID="527fb1a245b8b03bec5d82faa5a774b17399f65786d8762162ef556987d4ff0f" Feb 19 10:29:04 crc kubenswrapper[4873]: E0219 10:29:04.909042 4873 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"527fb1a245b8b03bec5d82faa5a774b17399f65786d8762162ef556987d4ff0f\": container with ID starting with 527fb1a245b8b03bec5d82faa5a774b17399f65786d8762162ef556987d4ff0f not found: ID does not exist" containerID="527fb1a245b8b03bec5d82faa5a774b17399f65786d8762162ef556987d4ff0f" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.909073 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"527fb1a245b8b03bec5d82faa5a774b17399f65786d8762162ef556987d4ff0f"} err="failed to get container status \"527fb1a245b8b03bec5d82faa5a774b17399f65786d8762162ef556987d4ff0f\": rpc error: code = NotFound desc = could not find container \"527fb1a245b8b03bec5d82faa5a774b17399f65786d8762162ef556987d4ff0f\": container with ID starting with 527fb1a245b8b03bec5d82faa5a774b17399f65786d8762162ef556987d4ff0f not found: ID does not exist" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.909092 4873 scope.go:117] "RemoveContainer" containerID="5dba258377743a5715353e87db5d11c9c4d59b17ccf94ca70c8d425eaaad3a52" Feb 19 10:29:04 crc kubenswrapper[4873]: E0219 10:29:04.909575 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dba258377743a5715353e87db5d11c9c4d59b17ccf94ca70c8d425eaaad3a52\": container with ID starting with 5dba258377743a5715353e87db5d11c9c4d59b17ccf94ca70c8d425eaaad3a52 not found: ID does not exist" containerID="5dba258377743a5715353e87db5d11c9c4d59b17ccf94ca70c8d425eaaad3a52" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.909602 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dba258377743a5715353e87db5d11c9c4d59b17ccf94ca70c8d425eaaad3a52"} err="failed to get container status \"5dba258377743a5715353e87db5d11c9c4d59b17ccf94ca70c8d425eaaad3a52\": rpc error: code = NotFound desc = could not find container 
\"5dba258377743a5715353e87db5d11c9c4d59b17ccf94ca70c8d425eaaad3a52\": container with ID starting with 5dba258377743a5715353e87db5d11c9c4d59b17ccf94ca70c8d425eaaad3a52 not found: ID does not exist" Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.974748 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 10:29:04 crc kubenswrapper[4873]: I0219 10:29:04.983166 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.001381 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 10:29:05 crc kubenswrapper[4873]: E0219 10:29:05.001863 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerName="config-reloader" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.001887 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerName="config-reloader" Feb 19 10:29:05 crc kubenswrapper[4873]: E0219 10:29:05.001929 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerName="thanos-sidecar" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.001939 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerName="thanos-sidecar" Feb 19 10:29:05 crc kubenswrapper[4873]: E0219 10:29:05.001956 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerName="prometheus" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.001964 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerName="prometheus" Feb 19 10:29:05 crc kubenswrapper[4873]: E0219 10:29:05.001980 4873 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerName="init-config-reloader" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.001989 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerName="init-config-reloader" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.002885 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerName="thanos-sidecar" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.002911 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerName="config-reloader" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.002945 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" containerName="prometheus" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.005318 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.010402 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.010481 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.010541 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.010541 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.011573 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 10:29:05 crc 
kubenswrapper[4873]: I0219 10:29:05.011594 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.013379 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-stpz9" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.021226 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.025086 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.153855 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ae630a8f-ee42-4f96-adb9-d18bf713af37-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.153972 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.153996 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ae630a8f-ee42-4f96-adb9-d18bf713af37-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.154026 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.154059 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlkc5\" (UniqueName: \"kubernetes.io/projected/ae630a8f-ee42-4f96-adb9-d18bf713af37-kube-api-access-xlkc5\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.154134 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.154165 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ae630a8f-ee42-4f96-adb9-d18bf713af37-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.154185 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-config\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.154253 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.154361 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ae630a8f-ee42-4f96-adb9-d18bf713af37-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.154420 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.154488 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " 
pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.154507 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ae630a8f-ee42-4f96-adb9-d18bf713af37-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.256210 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.256255 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ae630a8f-ee42-4f96-adb9-d18bf713af37-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.256287 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.256326 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlkc5\" (UniqueName: 
\"kubernetes.io/projected/ae630a8f-ee42-4f96-adb9-d18bf713af37-kube-api-access-xlkc5\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.256344 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.256375 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ae630a8f-ee42-4f96-adb9-d18bf713af37-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.256390 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-config\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.256422 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.256472 4873 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ae630a8f-ee42-4f96-adb9-d18bf713af37-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.256499 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.257365 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ae630a8f-ee42-4f96-adb9-d18bf713af37-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.257405 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ae630a8f-ee42-4f96-adb9-d18bf713af37-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.257489 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ae630a8f-ee42-4f96-adb9-d18bf713af37-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " 
pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.257511 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ae630a8f-ee42-4f96-adb9-d18bf713af37-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.257517 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.257726 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ae630a8f-ee42-4f96-adb9-d18bf713af37-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.262053 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ae630a8f-ee42-4f96-adb9-d18bf713af37-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.262338 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 
10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.263231 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.264511 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ae630a8f-ee42-4f96-adb9-d18bf713af37-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.264992 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.265180 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.267652 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-config\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 
crc kubenswrapper[4873]: I0219 10:29:05.267864 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ae630a8f-ee42-4f96-adb9-d18bf713af37-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.268974 4873 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.269001 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/668a04d4437b4137f130ddea3fc0a68c22db655664b336b39ceb124bf62a44ab/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.278138 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlkc5\" (UniqueName: \"kubernetes.io/projected/ae630a8f-ee42-4f96-adb9-d18bf713af37-kube-api-access-xlkc5\") pod \"prometheus-metric-storage-0\" (UID: \"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.321488 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53cd3a8c-3594-457a-b730-9dd2241fe45d\") pod \"prometheus-metric-storage-0\" (UID: 
\"ae630a8f-ee42-4f96-adb9-d18bf713af37\") " pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.495896 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1070e0c-7518-4d1b-bbb8-e56db1cad28a" path="/var/lib/kubelet/pods/d1070e0c-7518-4d1b-bbb8-e56db1cad28a/volumes" Feb 19 10:29:05 crc kubenswrapper[4873]: I0219 10:29:05.623360 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:06 crc kubenswrapper[4873]: I0219 10:29:06.158361 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 10:29:06 crc kubenswrapper[4873]: I0219 10:29:06.657779 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ae630a8f-ee42-4f96-adb9-d18bf713af37","Type":"ContainerStarted","Data":"431013b032f4126b2744293e55d0bfd32e67adcddf91411f3b1f8d94f1b15cd7"} Feb 19 10:29:09 crc kubenswrapper[4873]: I0219 10:29:09.684484 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ae630a8f-ee42-4f96-adb9-d18bf713af37","Type":"ContainerStarted","Data":"dad54f9a71fb953da71ad6657f7fd90a4bb9946b85d99390456690e39db803c2"} Feb 19 10:29:15 crc kubenswrapper[4873]: I0219 10:29:15.738465 4873 generic.go:334] "Generic (PLEG): container finished" podID="ae630a8f-ee42-4f96-adb9-d18bf713af37" containerID="dad54f9a71fb953da71ad6657f7fd90a4bb9946b85d99390456690e39db803c2" exitCode=0 Feb 19 10:29:15 crc kubenswrapper[4873]: I0219 10:29:15.738551 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ae630a8f-ee42-4f96-adb9-d18bf713af37","Type":"ContainerDied","Data":"dad54f9a71fb953da71ad6657f7fd90a4bb9946b85d99390456690e39db803c2"} Feb 19 10:29:16 crc kubenswrapper[4873]: I0219 10:29:16.755720 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"ae630a8f-ee42-4f96-adb9-d18bf713af37","Type":"ContainerStarted","Data":"a52a629df92a35867f1899536b8ad19d872e8ccf3d9767f467994eae589fe27f"} Feb 19 10:29:18 crc kubenswrapper[4873]: I0219 10:29:18.240659 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:29:18 crc kubenswrapper[4873]: I0219 10:29:18.241248 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:29:19 crc kubenswrapper[4873]: I0219 10:29:19.801818 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ae630a8f-ee42-4f96-adb9-d18bf713af37","Type":"ContainerStarted","Data":"7d28acf3957d0a323b9145c1139424486a892430cccbdca5fddf144d4f3ea371"} Feb 19 10:29:19 crc kubenswrapper[4873]: I0219 10:29:19.802210 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ae630a8f-ee42-4f96-adb9-d18bf713af37","Type":"ContainerStarted","Data":"7d779ac110a22a90f33893509862674bc0b9550c83fc6e2befa3abc140c292fd"} Feb 19 10:29:19 crc kubenswrapper[4873]: I0219 10:29:19.852508 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.852484569 podStartE2EDuration="15.852484569s" podCreationTimestamp="2026-02-19 10:29:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-19 10:29:19.845751319 +0000 UTC m=+2669.135182957" watchObservedRunningTime="2026-02-19 10:29:19.852484569 +0000 UTC m=+2669.141916207" Feb 19 10:29:20 crc kubenswrapper[4873]: I0219 10:29:20.624908 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:20 crc kubenswrapper[4873]: I0219 10:29:20.625485 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:20 crc kubenswrapper[4873]: I0219 10:29:20.633655 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:20 crc kubenswrapper[4873]: I0219 10:29:20.819439 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 10:29:48 crc kubenswrapper[4873]: I0219 10:29:48.240916 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:29:48 crc kubenswrapper[4873]: I0219 10:29:48.241657 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.201797 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.203726 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.206753 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.207443 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.207930 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.208353 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5bdht" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.213577 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.375334 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.375388 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.375410 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e5a79da-a068-4a68-ba79-6719ea0fb353-config-data\") pod \"tempest-tests-tempest\" (UID: 
\"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.375467 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e5a79da-a068-4a68-ba79-6719ea0fb353-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.375552 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e5a79da-a068-4a68-ba79-6719ea0fb353-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.375571 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e5a79da-a068-4a68-ba79-6719ea0fb353-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.375595 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.375615 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.375639 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdqm6\" (UniqueName: \"kubernetes.io/projected/5e5a79da-a068-4a68-ba79-6719ea0fb353-kube-api-access-hdqm6\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.477443 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e5a79da-a068-4a68-ba79-6719ea0fb353-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.477835 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e5a79da-a068-4a68-ba79-6719ea0fb353-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.477978 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.478130 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.478306 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdqm6\" (UniqueName: \"kubernetes.io/projected/5e5a79da-a068-4a68-ba79-6719ea0fb353-kube-api-access-hdqm6\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.478525 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.478659 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.478785 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e5a79da-a068-4a68-ba79-6719ea0fb353-config-data\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.478335 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e5a79da-a068-4a68-ba79-6719ea0fb353-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: 
\"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.478977 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e5a79da-a068-4a68-ba79-6719ea0fb353-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.479316 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e5a79da-a068-4a68-ba79-6719ea0fb353-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.479504 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e5a79da-a068-4a68-ba79-6719ea0fb353-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.479702 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.480303 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e5a79da-a068-4a68-ba79-6719ea0fb353-config-data\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " 
pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.484488 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.485242 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.485342 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.504365 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdqm6\" (UniqueName: \"kubernetes.io/projected/5e5a79da-a068-4a68-ba79-6719ea0fb353-kube-api-access-hdqm6\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.513495 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") " pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.522802 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 10:29:52 crc kubenswrapper[4873]: I0219 10:29:52.974595 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 19 10:29:53 crc kubenswrapper[4873]: I0219 10:29:53.126725 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5e5a79da-a068-4a68-ba79-6719ea0fb353","Type":"ContainerStarted","Data":"2ab42e52f993d6514497f49e7da17659fa93e4ac5da7295a0f0f52c753b83b71"} Feb 19 10:30:00 crc kubenswrapper[4873]: I0219 10:30:00.141067 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k"] Feb 19 10:30:00 crc kubenswrapper[4873]: I0219 10:30:00.143528 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" Feb 19 10:30:00 crc kubenswrapper[4873]: I0219 10:30:00.147456 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 10:30:00 crc kubenswrapper[4873]: I0219 10:30:00.147602 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 10:30:00 crc kubenswrapper[4873]: I0219 10:30:00.157159 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k"] Feb 19 10:30:00 crc kubenswrapper[4873]: I0219 10:30:00.244012 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e250d05-a293-4a3c-8658-99d1ae2dc894-config-volume\") pod \"collect-profiles-29524950-pp89k\" (UID: \"9e250d05-a293-4a3c-8658-99d1ae2dc894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" Feb 19 10:30:00 crc 
kubenswrapper[4873]: I0219 10:30:00.244178 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bvtm\" (UniqueName: \"kubernetes.io/projected/9e250d05-a293-4a3c-8658-99d1ae2dc894-kube-api-access-4bvtm\") pod \"collect-profiles-29524950-pp89k\" (UID: \"9e250d05-a293-4a3c-8658-99d1ae2dc894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" Feb 19 10:30:00 crc kubenswrapper[4873]: I0219 10:30:00.244218 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e250d05-a293-4a3c-8658-99d1ae2dc894-secret-volume\") pod \"collect-profiles-29524950-pp89k\" (UID: \"9e250d05-a293-4a3c-8658-99d1ae2dc894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" Feb 19 10:30:00 crc kubenswrapper[4873]: I0219 10:30:00.347921 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e250d05-a293-4a3c-8658-99d1ae2dc894-config-volume\") pod \"collect-profiles-29524950-pp89k\" (UID: \"9e250d05-a293-4a3c-8658-99d1ae2dc894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" Feb 19 10:30:00 crc kubenswrapper[4873]: I0219 10:30:00.348155 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bvtm\" (UniqueName: \"kubernetes.io/projected/9e250d05-a293-4a3c-8658-99d1ae2dc894-kube-api-access-4bvtm\") pod \"collect-profiles-29524950-pp89k\" (UID: \"9e250d05-a293-4a3c-8658-99d1ae2dc894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" Feb 19 10:30:00 crc kubenswrapper[4873]: I0219 10:30:00.348213 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e250d05-a293-4a3c-8658-99d1ae2dc894-secret-volume\") pod 
\"collect-profiles-29524950-pp89k\" (UID: \"9e250d05-a293-4a3c-8658-99d1ae2dc894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" Feb 19 10:30:00 crc kubenswrapper[4873]: I0219 10:30:00.353805 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e250d05-a293-4a3c-8658-99d1ae2dc894-config-volume\") pod \"collect-profiles-29524950-pp89k\" (UID: \"9e250d05-a293-4a3c-8658-99d1ae2dc894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" Feb 19 10:30:00 crc kubenswrapper[4873]: I0219 10:30:00.363880 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bvtm\" (UniqueName: \"kubernetes.io/projected/9e250d05-a293-4a3c-8658-99d1ae2dc894-kube-api-access-4bvtm\") pod \"collect-profiles-29524950-pp89k\" (UID: \"9e250d05-a293-4a3c-8658-99d1ae2dc894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" Feb 19 10:30:00 crc kubenswrapper[4873]: I0219 10:30:00.368956 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e250d05-a293-4a3c-8658-99d1ae2dc894-secret-volume\") pod \"collect-profiles-29524950-pp89k\" (UID: \"9e250d05-a293-4a3c-8658-99d1ae2dc894\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" Feb 19 10:30:00 crc kubenswrapper[4873]: I0219 10:30:00.471020 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k"
Feb 19 10:30:02 crc kubenswrapper[4873]: I0219 10:30:02.685340 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k"]
Feb 19 10:30:03 crc kubenswrapper[4873]: I0219 10:30:03.228676 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" event={"ID":"9e250d05-a293-4a3c-8658-99d1ae2dc894","Type":"ContainerStarted","Data":"97237f992c0b70ff79ac5f913c59bb566d1f47e053b9c438613bd77bb3e8a5fe"}
Feb 19 10:30:03 crc kubenswrapper[4873]: I0219 10:30:03.229010 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" event={"ID":"9e250d05-a293-4a3c-8658-99d1ae2dc894","Type":"ContainerStarted","Data":"af7c4713ce82bd54098f28780ef63c539c161e9088aec99b9673ec780d2c6e07"}
Feb 19 10:30:03 crc kubenswrapper[4873]: I0219 10:30:03.255412 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" podStartSLOduration=3.255392707 podStartE2EDuration="3.255392707s" podCreationTimestamp="2026-02-19 10:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:30:03.242286008 +0000 UTC m=+2712.531717646" watchObservedRunningTime="2026-02-19 10:30:03.255392707 +0000 UTC m=+2712.544824345"
Feb 19 10:30:04 crc kubenswrapper[4873]: I0219 10:30:04.238380 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5e5a79da-a068-4a68-ba79-6719ea0fb353","Type":"ContainerStarted","Data":"edd9b7584d145cddbcf9d8449ca8d5546aa8224b7f3731235eeab85ccb091862"}
Feb 19 10:30:04 crc kubenswrapper[4873]: I0219 10:30:04.241210 4873 generic.go:334] "Generic (PLEG): container finished" podID="9e250d05-a293-4a3c-8658-99d1ae2dc894" containerID="97237f992c0b70ff79ac5f913c59bb566d1f47e053b9c438613bd77bb3e8a5fe" exitCode=0
Feb 19 10:30:04 crc kubenswrapper[4873]: I0219 10:30:04.241259 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" event={"ID":"9e250d05-a293-4a3c-8658-99d1ae2dc894","Type":"ContainerDied","Data":"97237f992c0b70ff79ac5f913c59bb566d1f47e053b9c438613bd77bb3e8a5fe"}
Feb 19 10:30:04 crc kubenswrapper[4873]: I0219 10:30:04.256736 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.99174221 podStartE2EDuration="13.256714072s" podCreationTimestamp="2026-02-19 10:29:51 +0000 UTC" firstStartedPulling="2026-02-19 10:29:52.978986846 +0000 UTC m=+2702.268418484" lastFinishedPulling="2026-02-19 10:30:02.243958708 +0000 UTC m=+2711.533390346" observedRunningTime="2026-02-19 10:30:04.25463747 +0000 UTC m=+2713.544069128" watchObservedRunningTime="2026-02-19 10:30:04.256714072 +0000 UTC m=+2713.546145720"
Feb 19 10:30:05 crc kubenswrapper[4873]: I0219 10:30:05.637310 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k"
Feb 19 10:30:05 crc kubenswrapper[4873]: I0219 10:30:05.755778 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw"]
Feb 19 10:30:05 crc kubenswrapper[4873]: I0219 10:30:05.761511 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e250d05-a293-4a3c-8658-99d1ae2dc894-config-volume\") pod \"9e250d05-a293-4a3c-8658-99d1ae2dc894\" (UID: \"9e250d05-a293-4a3c-8658-99d1ae2dc894\") "
Feb 19 10:30:05 crc kubenswrapper[4873]: I0219 10:30:05.761761 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bvtm\" (UniqueName: \"kubernetes.io/projected/9e250d05-a293-4a3c-8658-99d1ae2dc894-kube-api-access-4bvtm\") pod \"9e250d05-a293-4a3c-8658-99d1ae2dc894\" (UID: \"9e250d05-a293-4a3c-8658-99d1ae2dc894\") "
Feb 19 10:30:05 crc kubenswrapper[4873]: I0219 10:30:05.761949 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e250d05-a293-4a3c-8658-99d1ae2dc894-config-volume" (OuterVolumeSpecName: "config-volume") pod "9e250d05-a293-4a3c-8658-99d1ae2dc894" (UID: "9e250d05-a293-4a3c-8658-99d1ae2dc894"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 10:30:05 crc kubenswrapper[4873]: I0219 10:30:05.762625 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e250d05-a293-4a3c-8658-99d1ae2dc894-secret-volume\") pod \"9e250d05-a293-4a3c-8658-99d1ae2dc894\" (UID: \"9e250d05-a293-4a3c-8658-99d1ae2dc894\") "
Feb 19 10:30:05 crc kubenswrapper[4873]: I0219 10:30:05.763296 4873 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e250d05-a293-4a3c-8658-99d1ae2dc894-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 10:30:05 crc kubenswrapper[4873]: I0219 10:30:05.765324 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524905-jqdfw"]
Feb 19 10:30:05 crc kubenswrapper[4873]: I0219 10:30:05.775336 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e250d05-a293-4a3c-8658-99d1ae2dc894-kube-api-access-4bvtm" (OuterVolumeSpecName: "kube-api-access-4bvtm") pod "9e250d05-a293-4a3c-8658-99d1ae2dc894" (UID: "9e250d05-a293-4a3c-8658-99d1ae2dc894"). InnerVolumeSpecName "kube-api-access-4bvtm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:30:05 crc kubenswrapper[4873]: I0219 10:30:05.775984 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e250d05-a293-4a3c-8658-99d1ae2dc894-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9e250d05-a293-4a3c-8658-99d1ae2dc894" (UID: "9e250d05-a293-4a3c-8658-99d1ae2dc894"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 10:30:05 crc kubenswrapper[4873]: I0219 10:30:05.865681 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bvtm\" (UniqueName: \"kubernetes.io/projected/9e250d05-a293-4a3c-8658-99d1ae2dc894-kube-api-access-4bvtm\") on node \"crc\" DevicePath \"\""
Feb 19 10:30:05 crc kubenswrapper[4873]: I0219 10:30:05.865708 4873 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e250d05-a293-4a3c-8658-99d1ae2dc894-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 10:30:06 crc kubenswrapper[4873]: I0219 10:30:06.265094 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k" event={"ID":"9e250d05-a293-4a3c-8658-99d1ae2dc894","Type":"ContainerDied","Data":"af7c4713ce82bd54098f28780ef63c539c161e9088aec99b9673ec780d2c6e07"}
Feb 19 10:30:06 crc kubenswrapper[4873]: I0219 10:30:06.265150 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af7c4713ce82bd54098f28780ef63c539c161e9088aec99b9673ec780d2c6e07"
Feb 19 10:30:06 crc kubenswrapper[4873]: I0219 10:30:06.265208 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k"
Feb 19 10:30:07 crc kubenswrapper[4873]: I0219 10:30:07.496970 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de77b9aa-b558-4431-b116-5e1e1cc116f3" path="/var/lib/kubelet/pods/de77b9aa-b558-4431-b116-5e1e1cc116f3/volumes"
Feb 19 10:30:18 crc kubenswrapper[4873]: I0219 10:30:18.240214 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 10:30:18 crc kubenswrapper[4873]: I0219 10:30:18.240730 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 10:30:18 crc kubenswrapper[4873]: I0219 10:30:18.240787 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7"
Feb 19 10:30:18 crc kubenswrapper[4873]: I0219 10:30:18.241628 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"edec24981d97e1beda63a2d9013b9abdb1a1dbeed2c76ab65161659d51d3be20"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 10:30:18 crc kubenswrapper[4873]: I0219 10:30:18.241680 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://edec24981d97e1beda63a2d9013b9abdb1a1dbeed2c76ab65161659d51d3be20" gracePeriod=600
Feb 19 10:30:18 crc kubenswrapper[4873]: I0219 10:30:18.443800 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="edec24981d97e1beda63a2d9013b9abdb1a1dbeed2c76ab65161659d51d3be20" exitCode=0
Feb 19 10:30:18 crc kubenswrapper[4873]: I0219 10:30:18.443892 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"edec24981d97e1beda63a2d9013b9abdb1a1dbeed2c76ab65161659d51d3be20"}
Feb 19 10:30:18 crc kubenswrapper[4873]: I0219 10:30:18.444091 4873 scope.go:117] "RemoveContainer" containerID="dca79e332cae1f6387c5683fc865ffa663ef7876e44a7e998dd322e8f3f6a806"
Feb 19 10:30:19 crc kubenswrapper[4873]: I0219 10:30:19.453869 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"}
Feb 19 10:31:02 crc kubenswrapper[4873]: I0219 10:31:02.180193 4873 scope.go:117] "RemoveContainer" containerID="e60bc2f916aff75454f8db4d5b15c6ae005baebfdcb79c0c87df06d3a9db5142"
Feb 19 10:31:29 crc kubenswrapper[4873]: I0219 10:31:29.821144 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rd7vw"]
Feb 19 10:31:29 crc kubenswrapper[4873]: E0219 10:31:29.822242 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e250d05-a293-4a3c-8658-99d1ae2dc894" containerName="collect-profiles"
Feb 19 10:31:29 crc kubenswrapper[4873]: I0219 10:31:29.822258 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e250d05-a293-4a3c-8658-99d1ae2dc894" containerName="collect-profiles"
Feb 19 10:31:29 crc kubenswrapper[4873]: I0219 10:31:29.822509 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e250d05-a293-4a3c-8658-99d1ae2dc894" containerName="collect-profiles"
Feb 19 10:31:29 crc kubenswrapper[4873]: I0219 10:31:29.824185 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:29 crc kubenswrapper[4873]: I0219 10:31:29.845926 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rd7vw"]
Feb 19 10:31:29 crc kubenswrapper[4873]: I0219 10:31:29.937830 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae001dc-8355-422d-909f-d7eb1f4e80fe-catalog-content\") pod \"community-operators-rd7vw\" (UID: \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\") " pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:29 crc kubenswrapper[4873]: I0219 10:31:29.937922 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44vst\" (UniqueName: \"kubernetes.io/projected/4ae001dc-8355-422d-909f-d7eb1f4e80fe-kube-api-access-44vst\") pod \"community-operators-rd7vw\" (UID: \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\") " pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:29 crc kubenswrapper[4873]: I0219 10:31:29.938138 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae001dc-8355-422d-909f-d7eb1f4e80fe-utilities\") pod \"community-operators-rd7vw\" (UID: \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\") " pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:30 crc kubenswrapper[4873]: I0219 10:31:30.040166 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae001dc-8355-422d-909f-d7eb1f4e80fe-utilities\") pod \"community-operators-rd7vw\" (UID: \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\") " pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:30 crc kubenswrapper[4873]: I0219 10:31:30.040229 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae001dc-8355-422d-909f-d7eb1f4e80fe-catalog-content\") pod \"community-operators-rd7vw\" (UID: \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\") " pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:30 crc kubenswrapper[4873]: I0219 10:31:30.040288 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44vst\" (UniqueName: \"kubernetes.io/projected/4ae001dc-8355-422d-909f-d7eb1f4e80fe-kube-api-access-44vst\") pod \"community-operators-rd7vw\" (UID: \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\") " pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:30 crc kubenswrapper[4873]: I0219 10:31:30.041073 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae001dc-8355-422d-909f-d7eb1f4e80fe-catalog-content\") pod \"community-operators-rd7vw\" (UID: \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\") " pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:30 crc kubenswrapper[4873]: I0219 10:31:30.041078 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae001dc-8355-422d-909f-d7eb1f4e80fe-utilities\") pod \"community-operators-rd7vw\" (UID: \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\") " pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:30 crc kubenswrapper[4873]: I0219 10:31:30.061124 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44vst\" (UniqueName: \"kubernetes.io/projected/4ae001dc-8355-422d-909f-d7eb1f4e80fe-kube-api-access-44vst\") pod \"community-operators-rd7vw\" (UID: \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\") " pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:30 crc kubenswrapper[4873]: I0219 10:31:30.149503 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:30 crc kubenswrapper[4873]: I0219 10:31:30.723373 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rd7vw"]
Feb 19 10:31:31 crc kubenswrapper[4873]: I0219 10:31:31.110930 4873 generic.go:334] "Generic (PLEG): container finished" podID="4ae001dc-8355-422d-909f-d7eb1f4e80fe" containerID="0727d3766c2b69920ee16cacf47fa390d5337877553fd613b0bde11e96028bc8" exitCode=0
Feb 19 10:31:31 crc kubenswrapper[4873]: I0219 10:31:31.111113 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rd7vw" event={"ID":"4ae001dc-8355-422d-909f-d7eb1f4e80fe","Type":"ContainerDied","Data":"0727d3766c2b69920ee16cacf47fa390d5337877553fd613b0bde11e96028bc8"}
Feb 19 10:31:31 crc kubenswrapper[4873]: I0219 10:31:31.111356 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rd7vw" event={"ID":"4ae001dc-8355-422d-909f-d7eb1f4e80fe","Type":"ContainerStarted","Data":"4b77c8207015a73521213a8d125fc1fc57465ec7ba0af072e6fc7af04a1b9a40"}
Feb 19 10:31:31 crc kubenswrapper[4873]: I0219 10:31:31.113963 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 10:31:32 crc kubenswrapper[4873]: I0219 10:31:32.120494 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rd7vw" event={"ID":"4ae001dc-8355-422d-909f-d7eb1f4e80fe","Type":"ContainerStarted","Data":"cd25869832d3ec5c774490990580f20eda364ccd36b96508fe7b8c2414ec8174"}
Feb 19 10:31:34 crc kubenswrapper[4873]: I0219 10:31:34.138899 4873 generic.go:334] "Generic (PLEG): container finished" podID="4ae001dc-8355-422d-909f-d7eb1f4e80fe" containerID="cd25869832d3ec5c774490990580f20eda364ccd36b96508fe7b8c2414ec8174" exitCode=0
Feb 19 10:31:34 crc kubenswrapper[4873]: I0219 10:31:34.138978 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rd7vw" event={"ID":"4ae001dc-8355-422d-909f-d7eb1f4e80fe","Type":"ContainerDied","Data":"cd25869832d3ec5c774490990580f20eda364ccd36b96508fe7b8c2414ec8174"}
Feb 19 10:31:35 crc kubenswrapper[4873]: I0219 10:31:35.150243 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rd7vw" event={"ID":"4ae001dc-8355-422d-909f-d7eb1f4e80fe","Type":"ContainerStarted","Data":"f09f5abde9388e17663169e87798acd45cbb400c6bc4af9c08536f553182545a"}
Feb 19 10:31:35 crc kubenswrapper[4873]: I0219 10:31:35.170868 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rd7vw" podStartSLOduration=2.480003354 podStartE2EDuration="6.17084718s" podCreationTimestamp="2026-02-19 10:31:29 +0000 UTC" firstStartedPulling="2026-02-19 10:31:31.113758509 +0000 UTC m=+2800.403190147" lastFinishedPulling="2026-02-19 10:31:34.804602335 +0000 UTC m=+2804.094033973" observedRunningTime="2026-02-19 10:31:35.164046599 +0000 UTC m=+2804.453478237" watchObservedRunningTime="2026-02-19 10:31:35.17084718 +0000 UTC m=+2804.460278818"
Feb 19 10:31:40 crc kubenswrapper[4873]: I0219 10:31:40.149747 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:40 crc kubenswrapper[4873]: I0219 10:31:40.150299 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:40 crc kubenswrapper[4873]: I0219 10:31:40.209515 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:40 crc kubenswrapper[4873]: I0219 10:31:40.284598 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:40 crc kubenswrapper[4873]: I0219 10:31:40.451432 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rd7vw"]
Feb 19 10:31:42 crc kubenswrapper[4873]: I0219 10:31:42.224023 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rd7vw" podUID="4ae001dc-8355-422d-909f-d7eb1f4e80fe" containerName="registry-server" containerID="cri-o://f09f5abde9388e17663169e87798acd45cbb400c6bc4af9c08536f553182545a" gracePeriod=2
Feb 19 10:31:42 crc kubenswrapper[4873]: I0219 10:31:42.707385 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:42 crc kubenswrapper[4873]: I0219 10:31:42.820846 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae001dc-8355-422d-909f-d7eb1f4e80fe-catalog-content\") pod \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\" (UID: \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\") "
Feb 19 10:31:42 crc kubenswrapper[4873]: I0219 10:31:42.821148 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae001dc-8355-422d-909f-d7eb1f4e80fe-utilities\") pod \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\" (UID: \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\") "
Feb 19 10:31:42 crc kubenswrapper[4873]: I0219 10:31:42.821199 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44vst\" (UniqueName: \"kubernetes.io/projected/4ae001dc-8355-422d-909f-d7eb1f4e80fe-kube-api-access-44vst\") pod \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\" (UID: \"4ae001dc-8355-422d-909f-d7eb1f4e80fe\") "
Feb 19 10:31:42 crc kubenswrapper[4873]: I0219 10:31:42.821751 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae001dc-8355-422d-909f-d7eb1f4e80fe-utilities" (OuterVolumeSpecName: "utilities") pod "4ae001dc-8355-422d-909f-d7eb1f4e80fe" (UID: "4ae001dc-8355-422d-909f-d7eb1f4e80fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:31:42 crc kubenswrapper[4873]: I0219 10:31:42.836473 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae001dc-8355-422d-909f-d7eb1f4e80fe-kube-api-access-44vst" (OuterVolumeSpecName: "kube-api-access-44vst") pod "4ae001dc-8355-422d-909f-d7eb1f4e80fe" (UID: "4ae001dc-8355-422d-909f-d7eb1f4e80fe"). InnerVolumeSpecName "kube-api-access-44vst". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:31:42 crc kubenswrapper[4873]: I0219 10:31:42.880464 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae001dc-8355-422d-909f-d7eb1f4e80fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ae001dc-8355-422d-909f-d7eb1f4e80fe" (UID: "4ae001dc-8355-422d-909f-d7eb1f4e80fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:31:42 crc kubenswrapper[4873]: I0219 10:31:42.924849 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44vst\" (UniqueName: \"kubernetes.io/projected/4ae001dc-8355-422d-909f-d7eb1f4e80fe-kube-api-access-44vst\") on node \"crc\" DevicePath \"\""
Feb 19 10:31:42 crc kubenswrapper[4873]: I0219 10:31:42.924905 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae001dc-8355-422d-909f-d7eb1f4e80fe-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 10:31:42 crc kubenswrapper[4873]: I0219 10:31:42.924920 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae001dc-8355-422d-909f-d7eb1f4e80fe-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.234985 4873 generic.go:334] "Generic (PLEG): container finished" podID="4ae001dc-8355-422d-909f-d7eb1f4e80fe" containerID="f09f5abde9388e17663169e87798acd45cbb400c6bc4af9c08536f553182545a" exitCode=0
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.235024 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rd7vw" event={"ID":"4ae001dc-8355-422d-909f-d7eb1f4e80fe","Type":"ContainerDied","Data":"f09f5abde9388e17663169e87798acd45cbb400c6bc4af9c08536f553182545a"}
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.235054 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rd7vw" event={"ID":"4ae001dc-8355-422d-909f-d7eb1f4e80fe","Type":"ContainerDied","Data":"4b77c8207015a73521213a8d125fc1fc57465ec7ba0af072e6fc7af04a1b9a40"}
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.235073 4873 scope.go:117] "RemoveContainer" containerID="f09f5abde9388e17663169e87798acd45cbb400c6bc4af9c08536f553182545a"
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.235080 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rd7vw"
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.256297 4873 scope.go:117] "RemoveContainer" containerID="cd25869832d3ec5c774490990580f20eda364ccd36b96508fe7b8c2414ec8174"
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.291176 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rd7vw"]
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.307485 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rd7vw"]
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.323581 4873 scope.go:117] "RemoveContainer" containerID="0727d3766c2b69920ee16cacf47fa390d5337877553fd613b0bde11e96028bc8"
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.350358 4873 scope.go:117] "RemoveContainer" containerID="f09f5abde9388e17663169e87798acd45cbb400c6bc4af9c08536f553182545a"
Feb 19 10:31:43 crc kubenswrapper[4873]: E0219 10:31:43.350853 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f09f5abde9388e17663169e87798acd45cbb400c6bc4af9c08536f553182545a\": container with ID starting with f09f5abde9388e17663169e87798acd45cbb400c6bc4af9c08536f553182545a not found: ID does not exist" containerID="f09f5abde9388e17663169e87798acd45cbb400c6bc4af9c08536f553182545a"
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.350889 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f09f5abde9388e17663169e87798acd45cbb400c6bc4af9c08536f553182545a"} err="failed to get container status \"f09f5abde9388e17663169e87798acd45cbb400c6bc4af9c08536f553182545a\": rpc error: code = NotFound desc = could not find container \"f09f5abde9388e17663169e87798acd45cbb400c6bc4af9c08536f553182545a\": container with ID starting with f09f5abde9388e17663169e87798acd45cbb400c6bc4af9c08536f553182545a not found: ID does not exist"
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.350910 4873 scope.go:117] "RemoveContainer" containerID="cd25869832d3ec5c774490990580f20eda364ccd36b96508fe7b8c2414ec8174"
Feb 19 10:31:43 crc kubenswrapper[4873]: E0219 10:31:43.351262 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd25869832d3ec5c774490990580f20eda364ccd36b96508fe7b8c2414ec8174\": container with ID starting with cd25869832d3ec5c774490990580f20eda364ccd36b96508fe7b8c2414ec8174 not found: ID does not exist" containerID="cd25869832d3ec5c774490990580f20eda364ccd36b96508fe7b8c2414ec8174"
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.351290 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd25869832d3ec5c774490990580f20eda364ccd36b96508fe7b8c2414ec8174"} err="failed to get container status \"cd25869832d3ec5c774490990580f20eda364ccd36b96508fe7b8c2414ec8174\": rpc error: code = NotFound desc = could not find container \"cd25869832d3ec5c774490990580f20eda364ccd36b96508fe7b8c2414ec8174\": container with ID starting with cd25869832d3ec5c774490990580f20eda364ccd36b96508fe7b8c2414ec8174 not found: ID does not exist"
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.351306 4873 scope.go:117] "RemoveContainer" containerID="0727d3766c2b69920ee16cacf47fa390d5337877553fd613b0bde11e96028bc8"
Feb 19 10:31:43 crc kubenswrapper[4873]: E0219 10:31:43.351560 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0727d3766c2b69920ee16cacf47fa390d5337877553fd613b0bde11e96028bc8\": container with ID starting with 0727d3766c2b69920ee16cacf47fa390d5337877553fd613b0bde11e96028bc8 not found: ID does not exist" containerID="0727d3766c2b69920ee16cacf47fa390d5337877553fd613b0bde11e96028bc8"
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.351609 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0727d3766c2b69920ee16cacf47fa390d5337877553fd613b0bde11e96028bc8"} err="failed to get container status \"0727d3766c2b69920ee16cacf47fa390d5337877553fd613b0bde11e96028bc8\": rpc error: code = NotFound desc = could not find container \"0727d3766c2b69920ee16cacf47fa390d5337877553fd613b0bde11e96028bc8\": container with ID starting with 0727d3766c2b69920ee16cacf47fa390d5337877553fd613b0bde11e96028bc8 not found: ID does not exist"
Feb 19 10:31:43 crc kubenswrapper[4873]: I0219 10:31:43.495799 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae001dc-8355-422d-909f-d7eb1f4e80fe" path="/var/lib/kubelet/pods/4ae001dc-8355-422d-909f-d7eb1f4e80fe/volumes"
Feb 19 10:32:18 crc kubenswrapper[4873]: I0219 10:32:18.241031 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 10:32:18 crc kubenswrapper[4873]: I0219 10:32:18.241687 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 10:32:48 crc kubenswrapper[4873]: I0219 10:32:48.240750 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 10:32:48 crc kubenswrapper[4873]: I0219 10:32:48.241367 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.171888 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-82qzr"]
Feb 19 10:32:54 crc kubenswrapper[4873]: E0219 10:32:54.174023 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae001dc-8355-422d-909f-d7eb1f4e80fe" containerName="registry-server"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.174069 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae001dc-8355-422d-909f-d7eb1f4e80fe" containerName="registry-server"
Feb 19 10:32:54 crc kubenswrapper[4873]: E0219 10:32:54.174094 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae001dc-8355-422d-909f-d7eb1f4e80fe" containerName="extract-utilities"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.174142 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae001dc-8355-422d-909f-d7eb1f4e80fe" containerName="extract-utilities"
Feb 19 10:32:54 crc kubenswrapper[4873]: E0219 10:32:54.174165 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae001dc-8355-422d-909f-d7eb1f4e80fe" containerName="extract-content"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.174175 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae001dc-8355-422d-909f-d7eb1f4e80fe" containerName="extract-content"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.176049 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae001dc-8355-422d-909f-d7eb1f4e80fe" containerName="registry-server"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.179609 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-82qzr"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.192420 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-82qzr"]
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.328822 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgnpq\" (UniqueName: \"kubernetes.io/projected/78f117c0-1029-4b43-ab4c-486312acf531-kube-api-access-kgnpq\") pod \"redhat-operators-82qzr\" (UID: \"78f117c0-1029-4b43-ab4c-486312acf531\") " pod="openshift-marketplace/redhat-operators-82qzr"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.328926 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78f117c0-1029-4b43-ab4c-486312acf531-catalog-content\") pod \"redhat-operators-82qzr\" (UID: \"78f117c0-1029-4b43-ab4c-486312acf531\") " pod="openshift-marketplace/redhat-operators-82qzr"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.329053 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f117c0-1029-4b43-ab4c-486312acf531-utilities\") pod \"redhat-operators-82qzr\" (UID: \"78f117c0-1029-4b43-ab4c-486312acf531\") " pod="openshift-marketplace/redhat-operators-82qzr"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.431339 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78f117c0-1029-4b43-ab4c-486312acf531-catalog-content\") pod \"redhat-operators-82qzr\" (UID: \"78f117c0-1029-4b43-ab4c-486312acf531\") " pod="openshift-marketplace/redhat-operators-82qzr"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.431517 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f117c0-1029-4b43-ab4c-486312acf531-utilities\") pod \"redhat-operators-82qzr\" (UID: \"78f117c0-1029-4b43-ab4c-486312acf531\") " pod="openshift-marketplace/redhat-operators-82qzr"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.431646 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgnpq\" (UniqueName: \"kubernetes.io/projected/78f117c0-1029-4b43-ab4c-486312acf531-kube-api-access-kgnpq\") pod \"redhat-operators-82qzr\" (UID: \"78f117c0-1029-4b43-ab4c-486312acf531\") " pod="openshift-marketplace/redhat-operators-82qzr"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.431914 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78f117c0-1029-4b43-ab4c-486312acf531-catalog-content\") pod \"redhat-operators-82qzr\" (UID: \"78f117c0-1029-4b43-ab4c-486312acf531\") " pod="openshift-marketplace/redhat-operators-82qzr"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.431951 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f117c0-1029-4b43-ab4c-486312acf531-utilities\") pod \"redhat-operators-82qzr\" (UID: \"78f117c0-1029-4b43-ab4c-486312acf531\") " pod="openshift-marketplace/redhat-operators-82qzr"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.459084 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgnpq\" (UniqueName: \"kubernetes.io/projected/78f117c0-1029-4b43-ab4c-486312acf531-kube-api-access-kgnpq\") pod \"redhat-operators-82qzr\" (UID: \"78f117c0-1029-4b43-ab4c-486312acf531\") " pod="openshift-marketplace/redhat-operators-82qzr"
Feb 19 10:32:54 crc kubenswrapper[4873]: I0219 10:32:54.520999 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-82qzr"
Feb 19 10:32:55 crc kubenswrapper[4873]: I0219 10:32:55.031765 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-82qzr"]
Feb 19 10:32:55 crc kubenswrapper[4873]: I0219 10:32:55.937633 4873 generic.go:334] "Generic (PLEG): container finished" podID="78f117c0-1029-4b43-ab4c-486312acf531" containerID="20bead4efa3d8df668099d6f92638ebd2934b3a50e58e9767728bce58454d134" exitCode=0
Feb 19 10:32:55 crc kubenswrapper[4873]: I0219 10:32:55.937704 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-82qzr" event={"ID":"78f117c0-1029-4b43-ab4c-486312acf531","Type":"ContainerDied","Data":"20bead4efa3d8df668099d6f92638ebd2934b3a50e58e9767728bce58454d134"}
Feb 19 10:32:55 crc kubenswrapper[4873]: I0219 10:32:55.937916 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-82qzr" event={"ID":"78f117c0-1029-4b43-ab4c-486312acf531","Type":"ContainerStarted","Data":"23eeef1ed125c5cde8dc4bc331e27cdd6957f05bc2af3732d543cd4f4e9e6072"}
Feb 19 10:32:57 crc kubenswrapper[4873]: I0219 10:32:57.965121 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-82qzr" event={"ID":"78f117c0-1029-4b43-ab4c-486312acf531","Type":"ContainerStarted","Data":"8fc2239af333bdc91dd32de0cb27c8416c88fd0a92fe40bc7d18bcc44f17dba7"}
Feb 19 10:33:03 crc kubenswrapper[4873]: I0219 10:33:03.012885 4873 generic.go:334] "Generic (PLEG): container finished" podID="78f117c0-1029-4b43-ab4c-486312acf531" containerID="8fc2239af333bdc91dd32de0cb27c8416c88fd0a92fe40bc7d18bcc44f17dba7" exitCode=0
Feb 19 10:33:03 crc kubenswrapper[4873]: I0219 10:33:03.012981 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-82qzr" event={"ID":"78f117c0-1029-4b43-ab4c-486312acf531","Type":"ContainerDied","Data":"8fc2239af333bdc91dd32de0cb27c8416c88fd0a92fe40bc7d18bcc44f17dba7"}
Feb 19 10:33:04 crc kubenswrapper[4873]: I0219 10:33:04.026621 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-82qzr" event={"ID":"78f117c0-1029-4b43-ab4c-486312acf531","Type":"ContainerStarted","Data":"1784bbce73148192c9784e04f030b74889d805efeb3cb76e27e0dcf0b45ea58b"}
Feb 19 10:33:04 crc kubenswrapper[4873]: I0219 10:33:04.046837 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-82qzr" podStartSLOduration=2.540697879 podStartE2EDuration="10.046818345s" podCreationTimestamp="2026-02-19 10:32:54 +0000 UTC" firstStartedPulling="2026-02-19 10:32:55.940594934 +0000 UTC m=+2885.230026572" lastFinishedPulling="2026-02-19 10:33:03.4467154 +0000 UTC m=+2892.736147038" observedRunningTime="2026-02-19 10:33:04.046182589 +0000 UTC m=+2893.335614237" watchObservedRunningTime="2026-02-19 10:33:04.046818345 +0000 UTC m=+2893.336249983"
Feb 19 10:33:04 crc kubenswrapper[4873]: I0219 10:33:04.521847 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-82qzr"
Feb 19 10:33:04 crc kubenswrapper[4873]: I0219 10:33:04.522287 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-82qzr"
Feb 19 10:33:05 crc kubenswrapper[4873]: I0219 10:33:05.570356 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-82qzr" podUID="78f117c0-1029-4b43-ab4c-486312acf531" containerName="registry-server" probeResult="failure" output=<
Feb 19 10:33:05 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s
Feb 19 10:33:05 crc kubenswrapper[4873]: >
Feb 19 10:33:10 crc kubenswrapper[4873]: I0219 10:33:10.152374 4873 kubelet.go:2421] "SyncLoop ADD"
source="api" pods=["openshift-marketplace/redhat-marketplace-2jmxn"] Feb 19 10:33:10 crc kubenswrapper[4873]: I0219 10:33:10.157164 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:10 crc kubenswrapper[4873]: I0219 10:33:10.161920 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jmxn"] Feb 19 10:33:10 crc kubenswrapper[4873]: I0219 10:33:10.285529 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0661a6c4-6ace-47e5-b3de-bcee0bda9714-catalog-content\") pod \"redhat-marketplace-2jmxn\" (UID: \"0661a6c4-6ace-47e5-b3de-bcee0bda9714\") " pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:10 crc kubenswrapper[4873]: I0219 10:33:10.285862 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj6ws\" (UniqueName: \"kubernetes.io/projected/0661a6c4-6ace-47e5-b3de-bcee0bda9714-kube-api-access-kj6ws\") pod \"redhat-marketplace-2jmxn\" (UID: \"0661a6c4-6ace-47e5-b3de-bcee0bda9714\") " pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:10 crc kubenswrapper[4873]: I0219 10:33:10.286225 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0661a6c4-6ace-47e5-b3de-bcee0bda9714-utilities\") pod \"redhat-marketplace-2jmxn\" (UID: \"0661a6c4-6ace-47e5-b3de-bcee0bda9714\") " pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:10 crc kubenswrapper[4873]: I0219 10:33:10.387927 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj6ws\" (UniqueName: \"kubernetes.io/projected/0661a6c4-6ace-47e5-b3de-bcee0bda9714-kube-api-access-kj6ws\") pod \"redhat-marketplace-2jmxn\" (UID: 
\"0661a6c4-6ace-47e5-b3de-bcee0bda9714\") " pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:10 crc kubenswrapper[4873]: I0219 10:33:10.388073 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0661a6c4-6ace-47e5-b3de-bcee0bda9714-utilities\") pod \"redhat-marketplace-2jmxn\" (UID: \"0661a6c4-6ace-47e5-b3de-bcee0bda9714\") " pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:10 crc kubenswrapper[4873]: I0219 10:33:10.388175 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0661a6c4-6ace-47e5-b3de-bcee0bda9714-catalog-content\") pod \"redhat-marketplace-2jmxn\" (UID: \"0661a6c4-6ace-47e5-b3de-bcee0bda9714\") " pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:10 crc kubenswrapper[4873]: I0219 10:33:10.388837 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0661a6c4-6ace-47e5-b3de-bcee0bda9714-utilities\") pod \"redhat-marketplace-2jmxn\" (UID: \"0661a6c4-6ace-47e5-b3de-bcee0bda9714\") " pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:10 crc kubenswrapper[4873]: I0219 10:33:10.388858 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0661a6c4-6ace-47e5-b3de-bcee0bda9714-catalog-content\") pod \"redhat-marketplace-2jmxn\" (UID: \"0661a6c4-6ace-47e5-b3de-bcee0bda9714\") " pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:10 crc kubenswrapper[4873]: I0219 10:33:10.410980 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj6ws\" (UniqueName: \"kubernetes.io/projected/0661a6c4-6ace-47e5-b3de-bcee0bda9714-kube-api-access-kj6ws\") pod \"redhat-marketplace-2jmxn\" (UID: \"0661a6c4-6ace-47e5-b3de-bcee0bda9714\") " 
pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:10 crc kubenswrapper[4873]: I0219 10:33:10.505562 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:11 crc kubenswrapper[4873]: I0219 10:33:11.026537 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jmxn"] Feb 19 10:33:11 crc kubenswrapper[4873]: I0219 10:33:11.098016 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jmxn" event={"ID":"0661a6c4-6ace-47e5-b3de-bcee0bda9714","Type":"ContainerStarted","Data":"3d648f78e1bea540f365ed9db4323aba5f331a0de290c0949ac76e5f5c3d0283"} Feb 19 10:33:12 crc kubenswrapper[4873]: I0219 10:33:12.110333 4873 generic.go:334] "Generic (PLEG): container finished" podID="0661a6c4-6ace-47e5-b3de-bcee0bda9714" containerID="e89c20b48341f8f16f1a8dca5bcf3723a43600c0e43831896fa283627ac561d7" exitCode=0 Feb 19 10:33:12 crc kubenswrapper[4873]: I0219 10:33:12.110419 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jmxn" event={"ID":"0661a6c4-6ace-47e5-b3de-bcee0bda9714","Type":"ContainerDied","Data":"e89c20b48341f8f16f1a8dca5bcf3723a43600c0e43831896fa283627ac561d7"} Feb 19 10:33:13 crc kubenswrapper[4873]: I0219 10:33:13.131478 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jmxn" event={"ID":"0661a6c4-6ace-47e5-b3de-bcee0bda9714","Type":"ContainerStarted","Data":"58616f1a828785e989384245e1f4513a0f70225ce36a2f5c5b7eef196afe6a55"} Feb 19 10:33:14 crc kubenswrapper[4873]: I0219 10:33:14.151500 4873 generic.go:334] "Generic (PLEG): container finished" podID="0661a6c4-6ace-47e5-b3de-bcee0bda9714" containerID="58616f1a828785e989384245e1f4513a0f70225ce36a2f5c5b7eef196afe6a55" exitCode=0 Feb 19 10:33:14 crc kubenswrapper[4873]: I0219 10:33:14.151572 4873 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-2jmxn" event={"ID":"0661a6c4-6ace-47e5-b3de-bcee0bda9714","Type":"ContainerDied","Data":"58616f1a828785e989384245e1f4513a0f70225ce36a2f5c5b7eef196afe6a55"} Feb 19 10:33:14 crc kubenswrapper[4873]: I0219 10:33:14.571821 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-82qzr" Feb 19 10:33:14 crc kubenswrapper[4873]: I0219 10:33:14.638085 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-82qzr" Feb 19 10:33:15 crc kubenswrapper[4873]: I0219 10:33:15.165287 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jmxn" event={"ID":"0661a6c4-6ace-47e5-b3de-bcee0bda9714","Type":"ContainerStarted","Data":"2cec4414aa0c1ffea81367f46dfe60a7afc0ca4a4aaf7c2b3439fdb480eebe45"} Feb 19 10:33:15 crc kubenswrapper[4873]: I0219 10:33:15.189688 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2jmxn" podStartSLOduration=2.764588609 podStartE2EDuration="5.189670107s" podCreationTimestamp="2026-02-19 10:33:10 +0000 UTC" firstStartedPulling="2026-02-19 10:33:12.113673058 +0000 UTC m=+2901.403104696" lastFinishedPulling="2026-02-19 10:33:14.538754556 +0000 UTC m=+2903.828186194" observedRunningTime="2026-02-19 10:33:15.184786314 +0000 UTC m=+2904.474217952" watchObservedRunningTime="2026-02-19 10:33:15.189670107 +0000 UTC m=+2904.479101735" Feb 19 10:33:16 crc kubenswrapper[4873]: I0219 10:33:16.933712 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-82qzr"] Feb 19 10:33:16 crc kubenswrapper[4873]: I0219 10:33:16.935870 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-82qzr" podUID="78f117c0-1029-4b43-ab4c-486312acf531" containerName="registry-server" 
containerID="cri-o://1784bbce73148192c9784e04f030b74889d805efeb3cb76e27e0dcf0b45ea58b" gracePeriod=2 Feb 19 10:33:17 crc kubenswrapper[4873]: I0219 10:33:17.192986 4873 generic.go:334] "Generic (PLEG): container finished" podID="78f117c0-1029-4b43-ab4c-486312acf531" containerID="1784bbce73148192c9784e04f030b74889d805efeb3cb76e27e0dcf0b45ea58b" exitCode=0 Feb 19 10:33:17 crc kubenswrapper[4873]: I0219 10:33:17.193281 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-82qzr" event={"ID":"78f117c0-1029-4b43-ab4c-486312acf531","Type":"ContainerDied","Data":"1784bbce73148192c9784e04f030b74889d805efeb3cb76e27e0dcf0b45ea58b"} Feb 19 10:33:17 crc kubenswrapper[4873]: I0219 10:33:17.400732 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-82qzr" Feb 19 10:33:17 crc kubenswrapper[4873]: I0219 10:33:17.455825 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78f117c0-1029-4b43-ab4c-486312acf531-catalog-content\") pod \"78f117c0-1029-4b43-ab4c-486312acf531\" (UID: \"78f117c0-1029-4b43-ab4c-486312acf531\") " Feb 19 10:33:17 crc kubenswrapper[4873]: I0219 10:33:17.455904 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgnpq\" (UniqueName: \"kubernetes.io/projected/78f117c0-1029-4b43-ab4c-486312acf531-kube-api-access-kgnpq\") pod \"78f117c0-1029-4b43-ab4c-486312acf531\" (UID: \"78f117c0-1029-4b43-ab4c-486312acf531\") " Feb 19 10:33:17 crc kubenswrapper[4873]: I0219 10:33:17.455937 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f117c0-1029-4b43-ab4c-486312acf531-utilities\") pod \"78f117c0-1029-4b43-ab4c-486312acf531\" (UID: \"78f117c0-1029-4b43-ab4c-486312acf531\") " Feb 19 10:33:17 crc kubenswrapper[4873]: I0219 
10:33:17.456889 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78f117c0-1029-4b43-ab4c-486312acf531-utilities" (OuterVolumeSpecName: "utilities") pod "78f117c0-1029-4b43-ab4c-486312acf531" (UID: "78f117c0-1029-4b43-ab4c-486312acf531"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:33:17 crc kubenswrapper[4873]: I0219 10:33:17.477000 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f117c0-1029-4b43-ab4c-486312acf531-kube-api-access-kgnpq" (OuterVolumeSpecName: "kube-api-access-kgnpq") pod "78f117c0-1029-4b43-ab4c-486312acf531" (UID: "78f117c0-1029-4b43-ab4c-486312acf531"). InnerVolumeSpecName "kube-api-access-kgnpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:33:17 crc kubenswrapper[4873]: I0219 10:33:17.559633 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgnpq\" (UniqueName: \"kubernetes.io/projected/78f117c0-1029-4b43-ab4c-486312acf531-kube-api-access-kgnpq\") on node \"crc\" DevicePath \"\"" Feb 19 10:33:17 crc kubenswrapper[4873]: I0219 10:33:17.559671 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f117c0-1029-4b43-ab4c-486312acf531-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 10:33:17 crc kubenswrapper[4873]: I0219 10:33:17.590859 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78f117c0-1029-4b43-ab4c-486312acf531-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78f117c0-1029-4b43-ab4c-486312acf531" (UID: "78f117c0-1029-4b43-ab4c-486312acf531"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:33:17 crc kubenswrapper[4873]: I0219 10:33:17.661409 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78f117c0-1029-4b43-ab4c-486312acf531-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 10:33:18 crc kubenswrapper[4873]: I0219 10:33:18.204475 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-82qzr" event={"ID":"78f117c0-1029-4b43-ab4c-486312acf531","Type":"ContainerDied","Data":"23eeef1ed125c5cde8dc4bc331e27cdd6957f05bc2af3732d543cd4f4e9e6072"} Feb 19 10:33:18 crc kubenswrapper[4873]: I0219 10:33:18.204549 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-82qzr" Feb 19 10:33:18 crc kubenswrapper[4873]: I0219 10:33:18.204821 4873 scope.go:117] "RemoveContainer" containerID="1784bbce73148192c9784e04f030b74889d805efeb3cb76e27e0dcf0b45ea58b" Feb 19 10:33:18 crc kubenswrapper[4873]: I0219 10:33:18.227715 4873 scope.go:117] "RemoveContainer" containerID="8fc2239af333bdc91dd32de0cb27c8416c88fd0a92fe40bc7d18bcc44f17dba7" Feb 19 10:33:18 crc kubenswrapper[4873]: I0219 10:33:18.240808 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:33:18 crc kubenswrapper[4873]: I0219 10:33:18.240858 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:33:18 crc kubenswrapper[4873]: I0219 
10:33:18.240907 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 10:33:18 crc kubenswrapper[4873]: I0219 10:33:18.241665 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 10:33:18 crc kubenswrapper[4873]: I0219 10:33:18.241723 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b" gracePeriod=600 Feb 19 10:33:18 crc kubenswrapper[4873]: I0219 10:33:18.242090 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-82qzr"] Feb 19 10:33:18 crc kubenswrapper[4873]: I0219 10:33:18.254633 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-82qzr"] Feb 19 10:33:18 crc kubenswrapper[4873]: I0219 10:33:18.268851 4873 scope.go:117] "RemoveContainer" containerID="20bead4efa3d8df668099d6f92638ebd2934b3a50e58e9767728bce58454d134" Feb 19 10:33:18 crc kubenswrapper[4873]: E0219 10:33:18.379071 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" 
Feb 19 10:33:19 crc kubenswrapper[4873]: I0219 10:33:19.217215 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b" exitCode=0 Feb 19 10:33:19 crc kubenswrapper[4873]: I0219 10:33:19.217285 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"} Feb 19 10:33:19 crc kubenswrapper[4873]: I0219 10:33:19.217554 4873 scope.go:117] "RemoveContainer" containerID="edec24981d97e1beda63a2d9013b9abdb1a1dbeed2c76ab65161659d51d3be20" Feb 19 10:33:19 crc kubenswrapper[4873]: I0219 10:33:19.218243 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b" Feb 19 10:33:19 crc kubenswrapper[4873]: E0219 10:33:19.218594 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:33:19 crc kubenswrapper[4873]: I0219 10:33:19.495312 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f117c0-1029-4b43-ab4c-486312acf531" path="/var/lib/kubelet/pods/78f117c0-1029-4b43-ab4c-486312acf531/volumes" Feb 19 10:33:20 crc kubenswrapper[4873]: I0219 10:33:20.506040 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:20 crc kubenswrapper[4873]: I0219 10:33:20.506343 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:20 crc kubenswrapper[4873]: I0219 10:33:20.552913 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:21 crc kubenswrapper[4873]: I0219 10:33:21.305818 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:21 crc kubenswrapper[4873]: I0219 10:33:21.719062 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jmxn"] Feb 19 10:33:23 crc kubenswrapper[4873]: I0219 10:33:23.264604 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2jmxn" podUID="0661a6c4-6ace-47e5-b3de-bcee0bda9714" containerName="registry-server" containerID="cri-o://2cec4414aa0c1ffea81367f46dfe60a7afc0ca4a4aaf7c2b3439fdb480eebe45" gracePeriod=2 Feb 19 10:33:23 crc kubenswrapper[4873]: I0219 10:33:23.730798 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:23 crc kubenswrapper[4873]: I0219 10:33:23.784698 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0661a6c4-6ace-47e5-b3de-bcee0bda9714-catalog-content\") pod \"0661a6c4-6ace-47e5-b3de-bcee0bda9714\" (UID: \"0661a6c4-6ace-47e5-b3de-bcee0bda9714\") " Feb 19 10:33:23 crc kubenswrapper[4873]: I0219 10:33:23.784772 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0661a6c4-6ace-47e5-b3de-bcee0bda9714-utilities\") pod \"0661a6c4-6ace-47e5-b3de-bcee0bda9714\" (UID: \"0661a6c4-6ace-47e5-b3de-bcee0bda9714\") " Feb 19 10:33:23 crc kubenswrapper[4873]: I0219 10:33:23.784827 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj6ws\" (UniqueName: \"kubernetes.io/projected/0661a6c4-6ace-47e5-b3de-bcee0bda9714-kube-api-access-kj6ws\") pod \"0661a6c4-6ace-47e5-b3de-bcee0bda9714\" (UID: \"0661a6c4-6ace-47e5-b3de-bcee0bda9714\") " Feb 19 10:33:23 crc kubenswrapper[4873]: I0219 10:33:23.787094 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0661a6c4-6ace-47e5-b3de-bcee0bda9714-utilities" (OuterVolumeSpecName: "utilities") pod "0661a6c4-6ace-47e5-b3de-bcee0bda9714" (UID: "0661a6c4-6ace-47e5-b3de-bcee0bda9714"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:33:23 crc kubenswrapper[4873]: I0219 10:33:23.791781 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0661a6c4-6ace-47e5-b3de-bcee0bda9714-kube-api-access-kj6ws" (OuterVolumeSpecName: "kube-api-access-kj6ws") pod "0661a6c4-6ace-47e5-b3de-bcee0bda9714" (UID: "0661a6c4-6ace-47e5-b3de-bcee0bda9714"). InnerVolumeSpecName "kube-api-access-kj6ws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:33:23 crc kubenswrapper[4873]: I0219 10:33:23.810541 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0661a6c4-6ace-47e5-b3de-bcee0bda9714-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0661a6c4-6ace-47e5-b3de-bcee0bda9714" (UID: "0661a6c4-6ace-47e5-b3de-bcee0bda9714"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:33:23 crc kubenswrapper[4873]: I0219 10:33:23.887186 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0661a6c4-6ace-47e5-b3de-bcee0bda9714-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 10:33:23 crc kubenswrapper[4873]: I0219 10:33:23.887524 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0661a6c4-6ace-47e5-b3de-bcee0bda9714-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 10:33:23 crc kubenswrapper[4873]: I0219 10:33:23.887536 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj6ws\" (UniqueName: \"kubernetes.io/projected/0661a6c4-6ace-47e5-b3de-bcee0bda9714-kube-api-access-kj6ws\") on node \"crc\" DevicePath \"\"" Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.275662 4873 generic.go:334] "Generic (PLEG): container finished" podID="0661a6c4-6ace-47e5-b3de-bcee0bda9714" containerID="2cec4414aa0c1ffea81367f46dfe60a7afc0ca4a4aaf7c2b3439fdb480eebe45" exitCode=0 Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.275705 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jmxn" event={"ID":"0661a6c4-6ace-47e5-b3de-bcee0bda9714","Type":"ContainerDied","Data":"2cec4414aa0c1ffea81367f46dfe60a7afc0ca4a4aaf7c2b3439fdb480eebe45"} Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.275729 4873 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-2jmxn" event={"ID":"0661a6c4-6ace-47e5-b3de-bcee0bda9714","Type":"ContainerDied","Data":"3d648f78e1bea540f365ed9db4323aba5f331a0de290c0949ac76e5f5c3d0283"} Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.275738 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jmxn" Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.275745 4873 scope.go:117] "RemoveContainer" containerID="2cec4414aa0c1ffea81367f46dfe60a7afc0ca4a4aaf7c2b3439fdb480eebe45" Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.306075 4873 scope.go:117] "RemoveContainer" containerID="58616f1a828785e989384245e1f4513a0f70225ce36a2f5c5b7eef196afe6a55" Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.337253 4873 scope.go:117] "RemoveContainer" containerID="e89c20b48341f8f16f1a8dca5bcf3723a43600c0e43831896fa283627ac561d7" Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.337636 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jmxn"] Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.350515 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jmxn"] Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.384987 4873 scope.go:117] "RemoveContainer" containerID="2cec4414aa0c1ffea81367f46dfe60a7afc0ca4a4aaf7c2b3439fdb480eebe45" Feb 19 10:33:24 crc kubenswrapper[4873]: E0219 10:33:24.385678 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cec4414aa0c1ffea81367f46dfe60a7afc0ca4a4aaf7c2b3439fdb480eebe45\": container with ID starting with 2cec4414aa0c1ffea81367f46dfe60a7afc0ca4a4aaf7c2b3439fdb480eebe45 not found: ID does not exist" containerID="2cec4414aa0c1ffea81367f46dfe60a7afc0ca4a4aaf7c2b3439fdb480eebe45" Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.385712 4873 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cec4414aa0c1ffea81367f46dfe60a7afc0ca4a4aaf7c2b3439fdb480eebe45"} err="failed to get container status \"2cec4414aa0c1ffea81367f46dfe60a7afc0ca4a4aaf7c2b3439fdb480eebe45\": rpc error: code = NotFound desc = could not find container \"2cec4414aa0c1ffea81367f46dfe60a7afc0ca4a4aaf7c2b3439fdb480eebe45\": container with ID starting with 2cec4414aa0c1ffea81367f46dfe60a7afc0ca4a4aaf7c2b3439fdb480eebe45 not found: ID does not exist"
Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.385734 4873 scope.go:117] "RemoveContainer" containerID="58616f1a828785e989384245e1f4513a0f70225ce36a2f5c5b7eef196afe6a55"
Feb 19 10:33:24 crc kubenswrapper[4873]: E0219 10:33:24.386006 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58616f1a828785e989384245e1f4513a0f70225ce36a2f5c5b7eef196afe6a55\": container with ID starting with 58616f1a828785e989384245e1f4513a0f70225ce36a2f5c5b7eef196afe6a55 not found: ID does not exist" containerID="58616f1a828785e989384245e1f4513a0f70225ce36a2f5c5b7eef196afe6a55"
Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.386021 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58616f1a828785e989384245e1f4513a0f70225ce36a2f5c5b7eef196afe6a55"} err="failed to get container status \"58616f1a828785e989384245e1f4513a0f70225ce36a2f5c5b7eef196afe6a55\": rpc error: code = NotFound desc = could not find container \"58616f1a828785e989384245e1f4513a0f70225ce36a2f5c5b7eef196afe6a55\": container with ID starting with 58616f1a828785e989384245e1f4513a0f70225ce36a2f5c5b7eef196afe6a55 not found: ID does not exist"
Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.386034 4873 scope.go:117] "RemoveContainer" containerID="e89c20b48341f8f16f1a8dca5bcf3723a43600c0e43831896fa283627ac561d7"
Feb 19 10:33:24 crc kubenswrapper[4873]: E0219 10:33:24.386271 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e89c20b48341f8f16f1a8dca5bcf3723a43600c0e43831896fa283627ac561d7\": container with ID starting with e89c20b48341f8f16f1a8dca5bcf3723a43600c0e43831896fa283627ac561d7 not found: ID does not exist" containerID="e89c20b48341f8f16f1a8dca5bcf3723a43600c0e43831896fa283627ac561d7"
Feb 19 10:33:24 crc kubenswrapper[4873]: I0219 10:33:24.386288 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e89c20b48341f8f16f1a8dca5bcf3723a43600c0e43831896fa283627ac561d7"} err="failed to get container status \"e89c20b48341f8f16f1a8dca5bcf3723a43600c0e43831896fa283627ac561d7\": rpc error: code = NotFound desc = could not find container \"e89c20b48341f8f16f1a8dca5bcf3723a43600c0e43831896fa283627ac561d7\": container with ID starting with e89c20b48341f8f16f1a8dca5bcf3723a43600c0e43831896fa283627ac561d7 not found: ID does not exist"
Feb 19 10:33:25 crc kubenswrapper[4873]: I0219 10:33:25.498156 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0661a6c4-6ace-47e5-b3de-bcee0bda9714" path="/var/lib/kubelet/pods/0661a6c4-6ace-47e5-b3de-bcee0bda9714/volumes"
Feb 19 10:33:32 crc kubenswrapper[4873]: I0219 10:33:32.484716 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:33:32 crc kubenswrapper[4873]: E0219 10:33:32.485561 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:33:47 crc kubenswrapper[4873]: I0219 10:33:47.485258 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:33:47 crc kubenswrapper[4873]: E0219 10:33:47.486000 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:33:58 crc kubenswrapper[4873]: I0219 10:33:58.484933 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:33:58 crc kubenswrapper[4873]: E0219 10:33:58.485750 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:34:11 crc kubenswrapper[4873]: I0219 10:34:11.493409 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:34:11 crc kubenswrapper[4873]: E0219 10:34:11.494451 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:34:23 crc kubenswrapper[4873]: I0219 10:34:23.485557 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:34:23 crc kubenswrapper[4873]: E0219 10:34:23.486550 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:34:36 crc kubenswrapper[4873]: I0219 10:34:36.484048 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:34:36 crc kubenswrapper[4873]: E0219 10:34:36.485232 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:34:48 crc kubenswrapper[4873]: I0219 10:34:48.484869 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:34:48 crc kubenswrapper[4873]: E0219 10:34:48.485579 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:35:00 crc kubenswrapper[4873]: I0219 10:35:00.484650 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:35:00 crc kubenswrapper[4873]: E0219 10:35:00.485646 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:35:14 crc kubenswrapper[4873]: I0219 10:35:14.484633 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:35:14 crc kubenswrapper[4873]: E0219 10:35:14.485274 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:35:26 crc kubenswrapper[4873]: I0219 10:35:26.485809 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:35:26 crc kubenswrapper[4873]: E0219 10:35:26.486508 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:35:41 crc kubenswrapper[4873]: I0219 10:35:41.492114 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:35:41 crc kubenswrapper[4873]: E0219 10:35:41.492954 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:35:53 crc kubenswrapper[4873]: I0219 10:35:53.484212 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:35:53 crc kubenswrapper[4873]: E0219 10:35:53.484939 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:36:07 crc kubenswrapper[4873]: I0219 10:36:07.484801 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:36:07 crc kubenswrapper[4873]: E0219 10:36:07.487281 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:36:19 crc kubenswrapper[4873]: I0219 10:36:19.485151 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:36:19 crc kubenswrapper[4873]: E0219 10:36:19.485967 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:36:32 crc kubenswrapper[4873]: I0219 10:36:32.484859 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:36:32 crc kubenswrapper[4873]: E0219 10:36:32.485591 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:36:44 crc kubenswrapper[4873]: I0219 10:36:44.484695 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:36:44 crc kubenswrapper[4873]: E0219 10:36:44.485501 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.534646 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xvs9k"]
Feb 19 10:36:45 crc kubenswrapper[4873]: E0219 10:36:45.535157 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f117c0-1029-4b43-ab4c-486312acf531" containerName="extract-content"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.535174 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f117c0-1029-4b43-ab4c-486312acf531" containerName="extract-content"
Feb 19 10:36:45 crc kubenswrapper[4873]: E0219 10:36:45.535218 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0661a6c4-6ace-47e5-b3de-bcee0bda9714" containerName="registry-server"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.535234 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0661a6c4-6ace-47e5-b3de-bcee0bda9714" containerName="registry-server"
Feb 19 10:36:45 crc kubenswrapper[4873]: E0219 10:36:45.535254 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f117c0-1029-4b43-ab4c-486312acf531" containerName="extract-utilities"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.535263 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f117c0-1029-4b43-ab4c-486312acf531" containerName="extract-utilities"
Feb 19 10:36:45 crc kubenswrapper[4873]: E0219 10:36:45.535280 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f117c0-1029-4b43-ab4c-486312acf531" containerName="registry-server"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.535287 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f117c0-1029-4b43-ab4c-486312acf531" containerName="registry-server"
Feb 19 10:36:45 crc kubenswrapper[4873]: E0219 10:36:45.535302 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0661a6c4-6ace-47e5-b3de-bcee0bda9714" containerName="extract-content"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.535308 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0661a6c4-6ace-47e5-b3de-bcee0bda9714" containerName="extract-content"
Feb 19 10:36:45 crc kubenswrapper[4873]: E0219 10:36:45.535330 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0661a6c4-6ace-47e5-b3de-bcee0bda9714" containerName="extract-utilities"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.535337 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0661a6c4-6ace-47e5-b3de-bcee0bda9714" containerName="extract-utilities"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.535610 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f117c0-1029-4b43-ab4c-486312acf531" containerName="registry-server"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.535635 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="0661a6c4-6ace-47e5-b3de-bcee0bda9714" containerName="registry-server"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.538440 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.572565 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xvs9k"]
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.691633 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-utilities\") pod \"certified-operators-xvs9k\" (UID: \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\") " pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.692190 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlm9n\" (UniqueName: \"kubernetes.io/projected/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-kube-api-access-vlm9n\") pod \"certified-operators-xvs9k\" (UID: \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\") " pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.692434 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-catalog-content\") pod \"certified-operators-xvs9k\" (UID: \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\") " pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.794707 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlm9n\" (UniqueName: \"kubernetes.io/projected/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-kube-api-access-vlm9n\") pod \"certified-operators-xvs9k\" (UID: \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\") " pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.794827 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-catalog-content\") pod \"certified-operators-xvs9k\" (UID: \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\") " pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.794979 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-utilities\") pod \"certified-operators-xvs9k\" (UID: \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\") " pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.795383 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-catalog-content\") pod \"certified-operators-xvs9k\" (UID: \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\") " pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.795559 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-utilities\") pod \"certified-operators-xvs9k\" (UID: \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\") " pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.815357 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlm9n\" (UniqueName: \"kubernetes.io/projected/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-kube-api-access-vlm9n\") pod \"certified-operators-xvs9k\" (UID: \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\") " pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:45 crc kubenswrapper[4873]: I0219 10:36:45.862934 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:46 crc kubenswrapper[4873]: I0219 10:36:46.423070 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xvs9k"]
Feb 19 10:36:46 crc kubenswrapper[4873]: I0219 10:36:46.730047 4873 generic.go:334] "Generic (PLEG): container finished" podID="70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" containerID="595867504c3014b4b9ad93c216f9e475546590ca7b34fa0cf9a413a0376f5f5e" exitCode=0
Feb 19 10:36:46 crc kubenswrapper[4873]: I0219 10:36:46.730216 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvs9k" event={"ID":"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd","Type":"ContainerDied","Data":"595867504c3014b4b9ad93c216f9e475546590ca7b34fa0cf9a413a0376f5f5e"}
Feb 19 10:36:46 crc kubenswrapper[4873]: I0219 10:36:46.730406 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvs9k" event={"ID":"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd","Type":"ContainerStarted","Data":"dabe8dc4309c8bccdc31acf5b6a881c32a5eaff06ffd335baa46a3af43b8b798"}
Feb 19 10:36:46 crc kubenswrapper[4873]: I0219 10:36:46.732083 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 10:36:48 crc kubenswrapper[4873]: I0219 10:36:48.749067 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvs9k" event={"ID":"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd","Type":"ContainerStarted","Data":"195edfdb7a76b47f01d4a03025fadd56103a54b406afbcead83f16f6187d6821"}
Feb 19 10:36:49 crc kubenswrapper[4873]: I0219 10:36:49.759013 4873 generic.go:334] "Generic (PLEG): container finished" podID="70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" containerID="195edfdb7a76b47f01d4a03025fadd56103a54b406afbcead83f16f6187d6821" exitCode=0
Feb 19 10:36:49 crc kubenswrapper[4873]: I0219 10:36:49.759262 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvs9k" event={"ID":"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd","Type":"ContainerDied","Data":"195edfdb7a76b47f01d4a03025fadd56103a54b406afbcead83f16f6187d6821"}
Feb 19 10:36:50 crc kubenswrapper[4873]: I0219 10:36:50.778994 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvs9k" event={"ID":"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd","Type":"ContainerStarted","Data":"bde6b244ff8c35b06ab9e678a9ef0296297322519a75b1850656167406d50fbf"}
Feb 19 10:36:50 crc kubenswrapper[4873]: I0219 10:36:50.804306 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xvs9k" podStartSLOduration=2.396104145 podStartE2EDuration="5.804288137s" podCreationTimestamp="2026-02-19 10:36:45 +0000 UTC" firstStartedPulling="2026-02-19 10:36:46.731807013 +0000 UTC m=+3116.021238651" lastFinishedPulling="2026-02-19 10:36:50.139991005 +0000 UTC m=+3119.429422643" observedRunningTime="2026-02-19 10:36:50.79647196 +0000 UTC m=+3120.085903598" watchObservedRunningTime="2026-02-19 10:36:50.804288137 +0000 UTC m=+3120.093719775"
Feb 19 10:36:55 crc kubenswrapper[4873]: I0219 10:36:55.484958 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:36:55 crc kubenswrapper[4873]: E0219 10:36:55.486349 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:36:55 crc kubenswrapper[4873]: I0219 10:36:55.863353 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:55 crc kubenswrapper[4873]: I0219 10:36:55.863777 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:55 crc kubenswrapper[4873]: I0219 10:36:55.918056 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:56 crc kubenswrapper[4873]: I0219 10:36:56.884165 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:56 crc kubenswrapper[4873]: I0219 10:36:56.943783 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xvs9k"]
Feb 19 10:36:58 crc kubenswrapper[4873]: I0219 10:36:58.850249 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xvs9k" podUID="70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" containerName="registry-server" containerID="cri-o://bde6b244ff8c35b06ab9e678a9ef0296297322519a75b1850656167406d50fbf" gracePeriod=2
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.343858 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.502758 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-utilities\") pod \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\" (UID: \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\") "
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.502801 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlm9n\" (UniqueName: \"kubernetes.io/projected/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-kube-api-access-vlm9n\") pod \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\" (UID: \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\") "
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.502962 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-catalog-content\") pod \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\" (UID: \"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd\") "
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.506905 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-utilities" (OuterVolumeSpecName: "utilities") pod "70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" (UID: "70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.512478 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-kube-api-access-vlm9n" (OuterVolumeSpecName: "kube-api-access-vlm9n") pod "70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" (UID: "70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd"). InnerVolumeSpecName "kube-api-access-vlm9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.555040 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" (UID: "70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.606572 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.606849 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.606861 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlm9n\" (UniqueName: \"kubernetes.io/projected/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd-kube-api-access-vlm9n\") on node \"crc\" DevicePath \"\""
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.863115 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xvs9k"
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.863169 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvs9k" event={"ID":"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd","Type":"ContainerDied","Data":"bde6b244ff8c35b06ab9e678a9ef0296297322519a75b1850656167406d50fbf"}
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.863241 4873 scope.go:117] "RemoveContainer" containerID="bde6b244ff8c35b06ab9e678a9ef0296297322519a75b1850656167406d50fbf"
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.862968 4873 generic.go:334] "Generic (PLEG): container finished" podID="70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" containerID="bde6b244ff8c35b06ab9e678a9ef0296297322519a75b1850656167406d50fbf" exitCode=0
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.864320 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvs9k" event={"ID":"70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd","Type":"ContainerDied","Data":"dabe8dc4309c8bccdc31acf5b6a881c32a5eaff06ffd335baa46a3af43b8b798"}
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.892618 4873 scope.go:117] "RemoveContainer" containerID="195edfdb7a76b47f01d4a03025fadd56103a54b406afbcead83f16f6187d6821"
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.906381 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xvs9k"]
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.915333 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xvs9k"]
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.928964 4873 scope.go:117] "RemoveContainer" containerID="595867504c3014b4b9ad93c216f9e475546590ca7b34fa0cf9a413a0376f5f5e"
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.970952 4873 scope.go:117] "RemoveContainer" containerID="bde6b244ff8c35b06ab9e678a9ef0296297322519a75b1850656167406d50fbf"
Feb 19 10:36:59 crc kubenswrapper[4873]: E0219 10:36:59.971617 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde6b244ff8c35b06ab9e678a9ef0296297322519a75b1850656167406d50fbf\": container with ID starting with bde6b244ff8c35b06ab9e678a9ef0296297322519a75b1850656167406d50fbf not found: ID does not exist" containerID="bde6b244ff8c35b06ab9e678a9ef0296297322519a75b1850656167406d50fbf"
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.971682 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde6b244ff8c35b06ab9e678a9ef0296297322519a75b1850656167406d50fbf"} err="failed to get container status \"bde6b244ff8c35b06ab9e678a9ef0296297322519a75b1850656167406d50fbf\": rpc error: code = NotFound desc = could not find container \"bde6b244ff8c35b06ab9e678a9ef0296297322519a75b1850656167406d50fbf\": container with ID starting with bde6b244ff8c35b06ab9e678a9ef0296297322519a75b1850656167406d50fbf not found: ID does not exist"
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.971718 4873 scope.go:117] "RemoveContainer" containerID="195edfdb7a76b47f01d4a03025fadd56103a54b406afbcead83f16f6187d6821"
Feb 19 10:36:59 crc kubenswrapper[4873]: E0219 10:36:59.972210 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"195edfdb7a76b47f01d4a03025fadd56103a54b406afbcead83f16f6187d6821\": container with ID starting with 195edfdb7a76b47f01d4a03025fadd56103a54b406afbcead83f16f6187d6821 not found: ID does not exist" containerID="195edfdb7a76b47f01d4a03025fadd56103a54b406afbcead83f16f6187d6821"
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.972262 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195edfdb7a76b47f01d4a03025fadd56103a54b406afbcead83f16f6187d6821"} err="failed to get container status \"195edfdb7a76b47f01d4a03025fadd56103a54b406afbcead83f16f6187d6821\": rpc error: code = NotFound desc = could not find container \"195edfdb7a76b47f01d4a03025fadd56103a54b406afbcead83f16f6187d6821\": container with ID starting with 195edfdb7a76b47f01d4a03025fadd56103a54b406afbcead83f16f6187d6821 not found: ID does not exist"
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.972298 4873 scope.go:117] "RemoveContainer" containerID="595867504c3014b4b9ad93c216f9e475546590ca7b34fa0cf9a413a0376f5f5e"
Feb 19 10:36:59 crc kubenswrapper[4873]: E0219 10:36:59.972993 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"595867504c3014b4b9ad93c216f9e475546590ca7b34fa0cf9a413a0376f5f5e\": container with ID starting with 595867504c3014b4b9ad93c216f9e475546590ca7b34fa0cf9a413a0376f5f5e not found: ID does not exist" containerID="595867504c3014b4b9ad93c216f9e475546590ca7b34fa0cf9a413a0376f5f5e"
Feb 19 10:36:59 crc kubenswrapper[4873]: I0219 10:36:59.973071 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595867504c3014b4b9ad93c216f9e475546590ca7b34fa0cf9a413a0376f5f5e"} err="failed to get container status \"595867504c3014b4b9ad93c216f9e475546590ca7b34fa0cf9a413a0376f5f5e\": rpc error: code = NotFound desc = could not find container \"595867504c3014b4b9ad93c216f9e475546590ca7b34fa0cf9a413a0376f5f5e\": container with ID starting with 595867504c3014b4b9ad93c216f9e475546590ca7b34fa0cf9a413a0376f5f5e not found: ID does not exist"
Feb 19 10:37:01 crc kubenswrapper[4873]: I0219 10:37:01.499674 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" path="/var/lib/kubelet/pods/70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd/volumes"
Feb 19 10:37:09 crc kubenswrapper[4873]: I0219 10:37:09.488010 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:37:09 crc kubenswrapper[4873]: E0219 10:37:09.488920 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:37:20 crc kubenswrapper[4873]: I0219 10:37:20.483669 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:37:20 crc kubenswrapper[4873]: E0219 10:37:20.484503 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:37:32 crc kubenswrapper[4873]: I0219 10:37:32.484141 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:37:32 crc kubenswrapper[4873]: E0219 10:37:32.484860 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:37:44 crc kubenswrapper[4873]: I0219 10:37:44.486405 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:37:44 crc kubenswrapper[4873]: E0219 10:37:44.488524 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:37:58 crc kubenswrapper[4873]: I0219 10:37:58.484757 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:37:58 crc kubenswrapper[4873]: E0219 10:37:58.485540 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:38:12 crc kubenswrapper[4873]: I0219 10:38:12.484766 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b"
Feb 19 10:38:12 crc kubenswrapper[4873]: E0219 10:38:12.485807 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb
19 10:38:27 crc kubenswrapper[4873]: I0219 10:38:27.484518 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b" Feb 19 10:38:28 crc kubenswrapper[4873]: I0219 10:38:28.730781 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"0d157f8531ba54c5f5368e05cf7f1a865a7d6835f71ea27ae242a81b66903a7f"} Feb 19 10:40:48 crc kubenswrapper[4873]: I0219 10:40:48.240769 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:40:48 crc kubenswrapper[4873]: I0219 10:40:48.241324 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:41:18 crc kubenswrapper[4873]: I0219 10:41:18.240844 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:41:18 crc kubenswrapper[4873]: I0219 10:41:18.242076 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 19 10:41:48 crc kubenswrapper[4873]: I0219 10:41:48.240612 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:41:48 crc kubenswrapper[4873]: I0219 10:41:48.241318 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:41:48 crc kubenswrapper[4873]: I0219 10:41:48.241368 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 10:41:48 crc kubenswrapper[4873]: I0219 10:41:48.242323 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d157f8531ba54c5f5368e05cf7f1a865a7d6835f71ea27ae242a81b66903a7f"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 10:41:48 crc kubenswrapper[4873]: I0219 10:41:48.242380 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://0d157f8531ba54c5f5368e05cf7f1a865a7d6835f71ea27ae242a81b66903a7f" gracePeriod=600 Feb 19 10:41:49 crc kubenswrapper[4873]: I0219 10:41:49.097919 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" 
containerID="0d157f8531ba54c5f5368e05cf7f1a865a7d6835f71ea27ae242a81b66903a7f" exitCode=0 Feb 19 10:41:49 crc kubenswrapper[4873]: I0219 10:41:49.098088 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"0d157f8531ba54c5f5368e05cf7f1a865a7d6835f71ea27ae242a81b66903a7f"} Feb 19 10:41:49 crc kubenswrapper[4873]: I0219 10:41:49.098471 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4"} Feb 19 10:41:49 crc kubenswrapper[4873]: I0219 10:41:49.098490 4873 scope.go:117] "RemoveContainer" containerID="626a71f4870e297e5c29d320555202aa31c88919871bbf2657a131f7539b445b" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.419699 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l45tw"] Feb 19 10:42:54 crc kubenswrapper[4873]: E0219 10:42:54.421201 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" containerName="extract-content" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.421218 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" containerName="extract-content" Feb 19 10:42:54 crc kubenswrapper[4873]: E0219 10:42:54.421232 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" containerName="extract-utilities" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.421241 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" containerName="extract-utilities" Feb 19 10:42:54 crc kubenswrapper[4873]: E0219 10:42:54.421307 4873 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" containerName="registry-server" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.421318 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" containerName="registry-server" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.422257 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c4c2b0-7e7a-4e61-b63a-ee5b16d32ebd" containerName="registry-server" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.428640 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.479860 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l45tw"] Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.545213 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-catalog-content\") pod \"community-operators-l45tw\" (UID: \"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\") " pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.545323 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-utilities\") pod \"community-operators-l45tw\" (UID: \"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\") " pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.545473 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c94h8\" (UniqueName: \"kubernetes.io/projected/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-kube-api-access-c94h8\") pod 
\"community-operators-l45tw\" (UID: \"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\") " pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.647653 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-catalog-content\") pod \"community-operators-l45tw\" (UID: \"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\") " pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.647745 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-utilities\") pod \"community-operators-l45tw\" (UID: \"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\") " pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.648344 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-catalog-content\") pod \"community-operators-l45tw\" (UID: \"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\") " pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.648479 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-utilities\") pod \"community-operators-l45tw\" (UID: \"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\") " pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.648513 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c94h8\" (UniqueName: \"kubernetes.io/projected/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-kube-api-access-c94h8\") pod \"community-operators-l45tw\" (UID: 
\"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\") " pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.671562 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c94h8\" (UniqueName: \"kubernetes.io/projected/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-kube-api-access-c94h8\") pod \"community-operators-l45tw\" (UID: \"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\") " pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:42:54 crc kubenswrapper[4873]: I0219 10:42:54.768164 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:42:55 crc kubenswrapper[4873]: I0219 10:42:55.327540 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l45tw"] Feb 19 10:42:55 crc kubenswrapper[4873]: I0219 10:42:55.721418 4873 generic.go:334] "Generic (PLEG): container finished" podID="b9bd624b-d288-4fc4-a24c-1e3283b10bf6" containerID="439235b1c3b4e94efa982e6a0cecbae46412fc2a2189779e92351095c7281eab" exitCode=0 Feb 19 10:42:55 crc kubenswrapper[4873]: I0219 10:42:55.721497 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45tw" event={"ID":"b9bd624b-d288-4fc4-a24c-1e3283b10bf6","Type":"ContainerDied","Data":"439235b1c3b4e94efa982e6a0cecbae46412fc2a2189779e92351095c7281eab"} Feb 19 10:42:55 crc kubenswrapper[4873]: I0219 10:42:55.721753 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45tw" event={"ID":"b9bd624b-d288-4fc4-a24c-1e3283b10bf6","Type":"ContainerStarted","Data":"d8a69facd98b183ff4ff2a4293c6b027e750acad29ea74e2ec4a6d345726c9cf"} Feb 19 10:42:55 crc kubenswrapper[4873]: I0219 10:42:55.724252 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 10:42:56 crc kubenswrapper[4873]: I0219 10:42:56.736124 4873 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45tw" event={"ID":"b9bd624b-d288-4fc4-a24c-1e3283b10bf6","Type":"ContainerStarted","Data":"d95bd0f4fa462c66a1498316d7b5c05f7c7eab1217db9c00ff06c38a64b9e64f"} Feb 19 10:43:00 crc kubenswrapper[4873]: I0219 10:43:00.781185 4873 generic.go:334] "Generic (PLEG): container finished" podID="b9bd624b-d288-4fc4-a24c-1e3283b10bf6" containerID="d95bd0f4fa462c66a1498316d7b5c05f7c7eab1217db9c00ff06c38a64b9e64f" exitCode=0 Feb 19 10:43:00 crc kubenswrapper[4873]: I0219 10:43:00.781270 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45tw" event={"ID":"b9bd624b-d288-4fc4-a24c-1e3283b10bf6","Type":"ContainerDied","Data":"d95bd0f4fa462c66a1498316d7b5c05f7c7eab1217db9c00ff06c38a64b9e64f"} Feb 19 10:43:01 crc kubenswrapper[4873]: I0219 10:43:01.796688 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45tw" event={"ID":"b9bd624b-d288-4fc4-a24c-1e3283b10bf6","Type":"ContainerStarted","Data":"50764a4aca3b93a56a6dd657a34c269f0c8b72265385e7bce385f947dbb29c8d"} Feb 19 10:43:01 crc kubenswrapper[4873]: I0219 10:43:01.826859 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l45tw" podStartSLOduration=2.263772351 podStartE2EDuration="7.826840958s" podCreationTimestamp="2026-02-19 10:42:54 +0000 UTC" firstStartedPulling="2026-02-19 10:42:55.723947228 +0000 UTC m=+3485.013378866" lastFinishedPulling="2026-02-19 10:43:01.287015825 +0000 UTC m=+3490.576447473" observedRunningTime="2026-02-19 10:43:01.822985591 +0000 UTC m=+3491.112417229" watchObservedRunningTime="2026-02-19 10:43:01.826840958 +0000 UTC m=+3491.116272616" Feb 19 10:43:04 crc kubenswrapper[4873]: I0219 10:43:04.769428 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l45tw" Feb 19 
10:43:04 crc kubenswrapper[4873]: I0219 10:43:04.769756 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:43:04 crc kubenswrapper[4873]: I0219 10:43:04.814616 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:43:14 crc kubenswrapper[4873]: I0219 10:43:14.818051 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:43:14 crc kubenswrapper[4873]: I0219 10:43:14.875377 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l45tw"] Feb 19 10:43:14 crc kubenswrapper[4873]: I0219 10:43:14.953264 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l45tw" podUID="b9bd624b-d288-4fc4-a24c-1e3283b10bf6" containerName="registry-server" containerID="cri-o://50764a4aca3b93a56a6dd657a34c269f0c8b72265385e7bce385f947dbb29c8d" gracePeriod=2 Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.517174 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.603152 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-catalog-content\") pod \"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\" (UID: \"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\") " Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.603215 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-utilities\") pod \"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\" (UID: \"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\") " Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.603306 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c94h8\" (UniqueName: \"kubernetes.io/projected/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-kube-api-access-c94h8\") pod \"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\" (UID: \"b9bd624b-d288-4fc4-a24c-1e3283b10bf6\") " Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.605063 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-utilities" (OuterVolumeSpecName: "utilities") pod "b9bd624b-d288-4fc4-a24c-1e3283b10bf6" (UID: "b9bd624b-d288-4fc4-a24c-1e3283b10bf6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.611147 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-kube-api-access-c94h8" (OuterVolumeSpecName: "kube-api-access-c94h8") pod "b9bd624b-d288-4fc4-a24c-1e3283b10bf6" (UID: "b9bd624b-d288-4fc4-a24c-1e3283b10bf6"). InnerVolumeSpecName "kube-api-access-c94h8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.672464 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9bd624b-d288-4fc4-a24c-1e3283b10bf6" (UID: "b9bd624b-d288-4fc4-a24c-1e3283b10bf6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.705507 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.705554 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.705571 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c94h8\" (UniqueName: \"kubernetes.io/projected/b9bd624b-d288-4fc4-a24c-1e3283b10bf6-kube-api-access-c94h8\") on node \"crc\" DevicePath \"\"" Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.967431 4873 generic.go:334] "Generic (PLEG): container finished" podID="b9bd624b-d288-4fc4-a24c-1e3283b10bf6" containerID="50764a4aca3b93a56a6dd657a34c269f0c8b72265385e7bce385f947dbb29c8d" exitCode=0 Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.967522 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45tw" event={"ID":"b9bd624b-d288-4fc4-a24c-1e3283b10bf6","Type":"ContainerDied","Data":"50764a4aca3b93a56a6dd657a34c269f0c8b72265385e7bce385f947dbb29c8d"} Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.967570 4873 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-l45tw" event={"ID":"b9bd624b-d288-4fc4-a24c-1e3283b10bf6","Type":"ContainerDied","Data":"d8a69facd98b183ff4ff2a4293c6b027e750acad29ea74e2ec4a6d345726c9cf"} Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.967600 4873 scope.go:117] "RemoveContainer" containerID="50764a4aca3b93a56a6dd657a34c269f0c8b72265385e7bce385f947dbb29c8d" Feb 19 10:43:15 crc kubenswrapper[4873]: I0219 10:43:15.967890 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l45tw" Feb 19 10:43:16 crc kubenswrapper[4873]: I0219 10:43:16.009888 4873 scope.go:117] "RemoveContainer" containerID="d95bd0f4fa462c66a1498316d7b5c05f7c7eab1217db9c00ff06c38a64b9e64f" Feb 19 10:43:16 crc kubenswrapper[4873]: I0219 10:43:16.017702 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l45tw"] Feb 19 10:43:16 crc kubenswrapper[4873]: I0219 10:43:16.032066 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l45tw"] Feb 19 10:43:16 crc kubenswrapper[4873]: I0219 10:43:16.037449 4873 scope.go:117] "RemoveContainer" containerID="439235b1c3b4e94efa982e6a0cecbae46412fc2a2189779e92351095c7281eab" Feb 19 10:43:16 crc kubenswrapper[4873]: I0219 10:43:16.087505 4873 scope.go:117] "RemoveContainer" containerID="50764a4aca3b93a56a6dd657a34c269f0c8b72265385e7bce385f947dbb29c8d" Feb 19 10:43:16 crc kubenswrapper[4873]: E0219 10:43:16.088980 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50764a4aca3b93a56a6dd657a34c269f0c8b72265385e7bce385f947dbb29c8d\": container with ID starting with 50764a4aca3b93a56a6dd657a34c269f0c8b72265385e7bce385f947dbb29c8d not found: ID does not exist" containerID="50764a4aca3b93a56a6dd657a34c269f0c8b72265385e7bce385f947dbb29c8d" Feb 19 10:43:16 crc kubenswrapper[4873]: I0219 
10:43:16.089056 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50764a4aca3b93a56a6dd657a34c269f0c8b72265385e7bce385f947dbb29c8d"} err="failed to get container status \"50764a4aca3b93a56a6dd657a34c269f0c8b72265385e7bce385f947dbb29c8d\": rpc error: code = NotFound desc = could not find container \"50764a4aca3b93a56a6dd657a34c269f0c8b72265385e7bce385f947dbb29c8d\": container with ID starting with 50764a4aca3b93a56a6dd657a34c269f0c8b72265385e7bce385f947dbb29c8d not found: ID does not exist" Feb 19 10:43:16 crc kubenswrapper[4873]: I0219 10:43:16.089121 4873 scope.go:117] "RemoveContainer" containerID="d95bd0f4fa462c66a1498316d7b5c05f7c7eab1217db9c00ff06c38a64b9e64f" Feb 19 10:43:16 crc kubenswrapper[4873]: E0219 10:43:16.089868 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d95bd0f4fa462c66a1498316d7b5c05f7c7eab1217db9c00ff06c38a64b9e64f\": container with ID starting with d95bd0f4fa462c66a1498316d7b5c05f7c7eab1217db9c00ff06c38a64b9e64f not found: ID does not exist" containerID="d95bd0f4fa462c66a1498316d7b5c05f7c7eab1217db9c00ff06c38a64b9e64f" Feb 19 10:43:16 crc kubenswrapper[4873]: I0219 10:43:16.089916 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d95bd0f4fa462c66a1498316d7b5c05f7c7eab1217db9c00ff06c38a64b9e64f"} err="failed to get container status \"d95bd0f4fa462c66a1498316d7b5c05f7c7eab1217db9c00ff06c38a64b9e64f\": rpc error: code = NotFound desc = could not find container \"d95bd0f4fa462c66a1498316d7b5c05f7c7eab1217db9c00ff06c38a64b9e64f\": container with ID starting with d95bd0f4fa462c66a1498316d7b5c05f7c7eab1217db9c00ff06c38a64b9e64f not found: ID does not exist" Feb 19 10:43:16 crc kubenswrapper[4873]: I0219 10:43:16.089957 4873 scope.go:117] "RemoveContainer" containerID="439235b1c3b4e94efa982e6a0cecbae46412fc2a2189779e92351095c7281eab" Feb 19 10:43:16 crc 
kubenswrapper[4873]: E0219 10:43:16.090419 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"439235b1c3b4e94efa982e6a0cecbae46412fc2a2189779e92351095c7281eab\": container with ID starting with 439235b1c3b4e94efa982e6a0cecbae46412fc2a2189779e92351095c7281eab not found: ID does not exist" containerID="439235b1c3b4e94efa982e6a0cecbae46412fc2a2189779e92351095c7281eab" Feb 19 10:43:16 crc kubenswrapper[4873]: I0219 10:43:16.090456 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"439235b1c3b4e94efa982e6a0cecbae46412fc2a2189779e92351095c7281eab"} err="failed to get container status \"439235b1c3b4e94efa982e6a0cecbae46412fc2a2189779e92351095c7281eab\": rpc error: code = NotFound desc = could not find container \"439235b1c3b4e94efa982e6a0cecbae46412fc2a2189779e92351095c7281eab\": container with ID starting with 439235b1c3b4e94efa982e6a0cecbae46412fc2a2189779e92351095c7281eab not found: ID does not exist" Feb 19 10:43:17 crc kubenswrapper[4873]: I0219 10:43:17.498027 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9bd624b-d288-4fc4-a24c-1e3283b10bf6" path="/var/lib/kubelet/pods/b9bd624b-d288-4fc4-a24c-1e3283b10bf6/volumes" Feb 19 10:43:48 crc kubenswrapper[4873]: I0219 10:43:48.240935 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:43:48 crc kubenswrapper[4873]: I0219 10:43:48.241576 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 19 10:44:18 crc kubenswrapper[4873]: I0219 10:44:18.241021 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:44:18 crc kubenswrapper[4873]: I0219 10:44:18.241591 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:44:48 crc kubenswrapper[4873]: I0219 10:44:48.240990 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:44:48 crc kubenswrapper[4873]: I0219 10:44:48.241801 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:44:48 crc kubenswrapper[4873]: I0219 10:44:48.242074 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 10:44:48 crc kubenswrapper[4873]: I0219 10:44:48.243199 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 10:44:48 crc kubenswrapper[4873]: I0219 10:44:48.243287 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" gracePeriod=600 Feb 19 10:44:48 crc kubenswrapper[4873]: E0219 10:44:48.368264 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:44:49 crc kubenswrapper[4873]: I0219 10:44:49.212979 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" exitCode=0 Feb 19 10:44:49 crc kubenswrapper[4873]: I0219 10:44:49.213085 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4"} Feb 19 10:44:49 crc kubenswrapper[4873]: I0219 10:44:49.213228 4873 scope.go:117] "RemoveContainer" containerID="0d157f8531ba54c5f5368e05cf7f1a865a7d6835f71ea27ae242a81b66903a7f" Feb 19 10:44:49 crc kubenswrapper[4873]: I0219 10:44:49.215212 4873 
scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:44:49 crc kubenswrapper[4873]: E0219 10:44:49.216214 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.165452 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6"] Feb 19 10:45:00 crc kubenswrapper[4873]: E0219 10:45:00.166665 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9bd624b-d288-4fc4-a24c-1e3283b10bf6" containerName="extract-content" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.166684 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bd624b-d288-4fc4-a24c-1e3283b10bf6" containerName="extract-content" Feb 19 10:45:00 crc kubenswrapper[4873]: E0219 10:45:00.166701 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9bd624b-d288-4fc4-a24c-1e3283b10bf6" containerName="registry-server" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.166709 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bd624b-d288-4fc4-a24c-1e3283b10bf6" containerName="registry-server" Feb 19 10:45:00 crc kubenswrapper[4873]: E0219 10:45:00.166761 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9bd624b-d288-4fc4-a24c-1e3283b10bf6" containerName="extract-utilities" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.166770 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bd624b-d288-4fc4-a24c-1e3283b10bf6" containerName="extract-utilities" Feb 19 
10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.167016 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9bd624b-d288-4fc4-a24c-1e3283b10bf6" containerName="registry-server" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.167846 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.175517 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6"] Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.182604 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.182772 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.301889 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-secret-volume\") pod \"collect-profiles-29524965-7h5c6\" (UID: \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.302538 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-config-volume\") pod \"collect-profiles-29524965-7h5c6\" (UID: \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.302914 4873 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twjnt\" (UniqueName: \"kubernetes.io/projected/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-kube-api-access-twjnt\") pod \"collect-profiles-29524965-7h5c6\" (UID: \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.404161 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-secret-volume\") pod \"collect-profiles-29524965-7h5c6\" (UID: \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.404303 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-config-volume\") pod \"collect-profiles-29524965-7h5c6\" (UID: \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.404398 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twjnt\" (UniqueName: \"kubernetes.io/projected/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-kube-api-access-twjnt\") pod \"collect-profiles-29524965-7h5c6\" (UID: \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.405242 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-config-volume\") pod \"collect-profiles-29524965-7h5c6\" (UID: \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.415226 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-secret-volume\") pod \"collect-profiles-29524965-7h5c6\" (UID: \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.424910 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twjnt\" (UniqueName: \"kubernetes.io/projected/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-kube-api-access-twjnt\") pod \"collect-profiles-29524965-7h5c6\" (UID: \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.505272 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" Feb 19 10:45:00 crc kubenswrapper[4873]: I0219 10:45:00.964826 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6"] Feb 19 10:45:01 crc kubenswrapper[4873]: I0219 10:45:01.332521 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" event={"ID":"e3672337-92bc-4e97-9c9e-c0a7e7cd284b","Type":"ContainerStarted","Data":"54b95d1d4eacbeaa7320e5a5833d0056b13e15ee90dc6b62a2553c6f88d2fff8"} Feb 19 10:45:01 crc kubenswrapper[4873]: I0219 10:45:01.334218 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" event={"ID":"e3672337-92bc-4e97-9c9e-c0a7e7cd284b","Type":"ContainerStarted","Data":"a7e7997adc2bfeb5ccc5b987d6d0f9aa1b9bafe6a9bab781afe26c452dd4c597"} Feb 19 10:45:01 crc kubenswrapper[4873]: I0219 10:45:01.360503 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" podStartSLOduration=1.360468367 podStartE2EDuration="1.360468367s" podCreationTimestamp="2026-02-19 10:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 10:45:01.351684837 +0000 UTC m=+3610.641116515" watchObservedRunningTime="2026-02-19 10:45:01.360468367 +0000 UTC m=+3610.649900005" Feb 19 10:45:02 crc kubenswrapper[4873]: I0219 10:45:02.342426 4873 generic.go:334] "Generic (PLEG): container finished" podID="e3672337-92bc-4e97-9c9e-c0a7e7cd284b" containerID="54b95d1d4eacbeaa7320e5a5833d0056b13e15ee90dc6b62a2553c6f88d2fff8" exitCode=0 Feb 19 10:45:02 crc kubenswrapper[4873]: I0219 10:45:02.342554 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" event={"ID":"e3672337-92bc-4e97-9c9e-c0a7e7cd284b","Type":"ContainerDied","Data":"54b95d1d4eacbeaa7320e5a5833d0056b13e15ee90dc6b62a2553c6f88d2fff8"} Feb 19 10:45:03 crc kubenswrapper[4873]: I0219 10:45:03.485943 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:45:03 crc kubenswrapper[4873]: E0219 10:45:03.486513 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:45:03 crc kubenswrapper[4873]: I0219 10:45:03.681292 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" Feb 19 10:45:03 crc kubenswrapper[4873]: I0219 10:45:03.883960 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-secret-volume\") pod \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\" (UID: \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\") " Feb 19 10:45:03 crc kubenswrapper[4873]: I0219 10:45:03.884249 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-config-volume\") pod \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\" (UID: \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\") " Feb 19 10:45:03 crc kubenswrapper[4873]: I0219 10:45:03.884398 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twjnt\" (UniqueName: \"kubernetes.io/projected/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-kube-api-access-twjnt\") pod \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\" (UID: \"e3672337-92bc-4e97-9c9e-c0a7e7cd284b\") " Feb 19 10:45:03 crc kubenswrapper[4873]: I0219 10:45:03.885433 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-config-volume" (OuterVolumeSpecName: "config-volume") pod "e3672337-92bc-4e97-9c9e-c0a7e7cd284b" (UID: "e3672337-92bc-4e97-9c9e-c0a7e7cd284b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 10:45:03 crc kubenswrapper[4873]: I0219 10:45:03.891246 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e3672337-92bc-4e97-9c9e-c0a7e7cd284b" (UID: "e3672337-92bc-4e97-9c9e-c0a7e7cd284b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 10:45:03 crc kubenswrapper[4873]: I0219 10:45:03.892440 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-kube-api-access-twjnt" (OuterVolumeSpecName: "kube-api-access-twjnt") pod "e3672337-92bc-4e97-9c9e-c0a7e7cd284b" (UID: "e3672337-92bc-4e97-9c9e-c0a7e7cd284b"). InnerVolumeSpecName "kube-api-access-twjnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:45:03 crc kubenswrapper[4873]: I0219 10:45:03.986732 4873 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 10:45:03 crc kubenswrapper[4873]: I0219 10:45:03.986767 4873 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 10:45:03 crc kubenswrapper[4873]: I0219 10:45:03.986777 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twjnt\" (UniqueName: \"kubernetes.io/projected/e3672337-92bc-4e97-9c9e-c0a7e7cd284b-kube-api-access-twjnt\") on node \"crc\" DevicePath \"\"" Feb 19 10:45:04 crc kubenswrapper[4873]: I0219 10:45:04.364803 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" event={"ID":"e3672337-92bc-4e97-9c9e-c0a7e7cd284b","Type":"ContainerDied","Data":"a7e7997adc2bfeb5ccc5b987d6d0f9aa1b9bafe6a9bab781afe26c452dd4c597"} Feb 19 10:45:04 crc kubenswrapper[4873]: I0219 10:45:04.364880 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7e7997adc2bfeb5ccc5b987d6d0f9aa1b9bafe6a9bab781afe26c452dd4c597" Feb 19 10:45:04 crc kubenswrapper[4873]: I0219 10:45:04.364998 4873 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6" Feb 19 10:45:04 crc kubenswrapper[4873]: I0219 10:45:04.767091 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm"] Feb 19 10:45:04 crc kubenswrapper[4873]: I0219 10:45:04.776924 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524920-796dm"] Feb 19 10:45:05 crc kubenswrapper[4873]: I0219 10:45:05.504593 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="890a4af6-c400-4f2c-a387-edcbbc821b11" path="/var/lib/kubelet/pods/890a4af6-c400-4f2c-a387-edcbbc821b11/volumes" Feb 19 10:45:18 crc kubenswrapper[4873]: I0219 10:45:18.485370 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:45:18 crc kubenswrapper[4873]: E0219 10:45:18.486353 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:45:32 crc kubenswrapper[4873]: I0219 10:45:32.484768 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:45:32 crc kubenswrapper[4873]: E0219 10:45:32.486744 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:45:47 crc kubenswrapper[4873]: I0219 10:45:47.484519 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:45:47 crc kubenswrapper[4873]: E0219 10:45:47.485317 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:46:00 crc kubenswrapper[4873]: I0219 10:46:00.484605 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:46:00 crc kubenswrapper[4873]: E0219 10:46:00.485408 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:46:02 crc kubenswrapper[4873]: I0219 10:46:02.627239 4873 scope.go:117] "RemoveContainer" containerID="2ea87556ea1e2777f378238131c83ccd55a7eac5410c13097afbd46ee33f0929" Feb 19 10:46:11 crc kubenswrapper[4873]: I0219 10:46:11.490415 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:46:11 crc kubenswrapper[4873]: E0219 10:46:11.493375 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:46:23 crc kubenswrapper[4873]: I0219 10:46:23.484187 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:46:23 crc kubenswrapper[4873]: E0219 10:46:23.485033 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:46:34 crc kubenswrapper[4873]: I0219 10:46:34.483884 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:46:34 crc kubenswrapper[4873]: E0219 10:46:34.484740 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:46:48 crc kubenswrapper[4873]: I0219 10:46:48.485044 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:46:48 crc kubenswrapper[4873]: E0219 10:46:48.485876 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:47:03 crc kubenswrapper[4873]: I0219 10:47:03.484555 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:47:03 crc kubenswrapper[4873]: E0219 10:47:03.485499 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:47:17 crc kubenswrapper[4873]: I0219 10:47:17.484476 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:47:17 crc kubenswrapper[4873]: E0219 10:47:17.485364 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:47:29 crc kubenswrapper[4873]: I0219 10:47:29.484525 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:47:29 crc kubenswrapper[4873]: E0219 10:47:29.485362 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:47:44 crc kubenswrapper[4873]: I0219 10:47:44.493326 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:47:44 crc kubenswrapper[4873]: E0219 10:47:44.495785 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.206288 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gn8wh"] Feb 19 10:47:51 crc kubenswrapper[4873]: E0219 10:47:51.207465 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3672337-92bc-4e97-9c9e-c0a7e7cd284b" containerName="collect-profiles" Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.207486 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3672337-92bc-4e97-9c9e-c0a7e7cd284b" containerName="collect-profiles" Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.207786 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3672337-92bc-4e97-9c9e-c0a7e7cd284b" containerName="collect-profiles" Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.209744 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.227155 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn8wh"] Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.327770 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65669cf-686e-4ae9-a210-66ae759bfe37-utilities\") pod \"certified-operators-gn8wh\" (UID: \"a65669cf-686e-4ae9-a210-66ae759bfe37\") " pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.327995 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65669cf-686e-4ae9-a210-66ae759bfe37-catalog-content\") pod \"certified-operators-gn8wh\" (UID: \"a65669cf-686e-4ae9-a210-66ae759bfe37\") " pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.328028 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26vkd\" (UniqueName: \"kubernetes.io/projected/a65669cf-686e-4ae9-a210-66ae759bfe37-kube-api-access-26vkd\") pod \"certified-operators-gn8wh\" (UID: \"a65669cf-686e-4ae9-a210-66ae759bfe37\") " pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.430445 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65669cf-686e-4ae9-a210-66ae759bfe37-utilities\") pod \"certified-operators-gn8wh\" (UID: \"a65669cf-686e-4ae9-a210-66ae759bfe37\") " pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.430642 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65669cf-686e-4ae9-a210-66ae759bfe37-catalog-content\") pod \"certified-operators-gn8wh\" (UID: \"a65669cf-686e-4ae9-a210-66ae759bfe37\") " pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.430675 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26vkd\" (UniqueName: \"kubernetes.io/projected/a65669cf-686e-4ae9-a210-66ae759bfe37-kube-api-access-26vkd\") pod \"certified-operators-gn8wh\" (UID: \"a65669cf-686e-4ae9-a210-66ae759bfe37\") " pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.431118 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65669cf-686e-4ae9-a210-66ae759bfe37-utilities\") pod \"certified-operators-gn8wh\" (UID: \"a65669cf-686e-4ae9-a210-66ae759bfe37\") " pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.431209 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65669cf-686e-4ae9-a210-66ae759bfe37-catalog-content\") pod \"certified-operators-gn8wh\" (UID: \"a65669cf-686e-4ae9-a210-66ae759bfe37\") " pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.456257 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26vkd\" (UniqueName: \"kubernetes.io/projected/a65669cf-686e-4ae9-a210-66ae759bfe37-kube-api-access-26vkd\") pod \"certified-operators-gn8wh\" (UID: \"a65669cf-686e-4ae9-a210-66ae759bfe37\") " pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:47:51 crc kubenswrapper[4873]: I0219 10:47:51.540793 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:47:52 crc kubenswrapper[4873]: I0219 10:47:52.183453 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn8wh"] Feb 19 10:47:52 crc kubenswrapper[4873]: I0219 10:47:52.231943 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn8wh" event={"ID":"a65669cf-686e-4ae9-a210-66ae759bfe37","Type":"ContainerStarted","Data":"15e6bed6cf4319505dcff2fd62cf8014563b596799be90375f21165a31ca87b7"} Feb 19 10:47:53 crc kubenswrapper[4873]: I0219 10:47:53.248256 4873 generic.go:334] "Generic (PLEG): container finished" podID="a65669cf-686e-4ae9-a210-66ae759bfe37" containerID="d0bcc2985ae3863253dbcfc0804c289bd2f9b9d3b0af599bfe995ac18937285d" exitCode=0 Feb 19 10:47:53 crc kubenswrapper[4873]: I0219 10:47:53.248353 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn8wh" event={"ID":"a65669cf-686e-4ae9-a210-66ae759bfe37","Type":"ContainerDied","Data":"d0bcc2985ae3863253dbcfc0804c289bd2f9b9d3b0af599bfe995ac18937285d"} Feb 19 10:47:55 crc kubenswrapper[4873]: I0219 10:47:55.265923 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn8wh" event={"ID":"a65669cf-686e-4ae9-a210-66ae759bfe37","Type":"ContainerStarted","Data":"43e61c3a28ba9a4e810156c46101fe16c402171370482f2ea90d30248e76b25f"} Feb 19 10:47:56 crc kubenswrapper[4873]: I0219 10:47:56.484459 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:47:56 crc kubenswrapper[4873]: E0219 10:47:56.484967 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:47:59 crc kubenswrapper[4873]: I0219 10:47:59.318956 4873 generic.go:334] "Generic (PLEG): container finished" podID="a65669cf-686e-4ae9-a210-66ae759bfe37" containerID="43e61c3a28ba9a4e810156c46101fe16c402171370482f2ea90d30248e76b25f" exitCode=0 Feb 19 10:47:59 crc kubenswrapper[4873]: I0219 10:47:59.318996 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn8wh" event={"ID":"a65669cf-686e-4ae9-a210-66ae759bfe37","Type":"ContainerDied","Data":"43e61c3a28ba9a4e810156c46101fe16c402171370482f2ea90d30248e76b25f"} Feb 19 10:47:59 crc kubenswrapper[4873]: I0219 10:47:59.323217 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 10:48:00 crc kubenswrapper[4873]: I0219 10:48:00.346805 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn8wh" event={"ID":"a65669cf-686e-4ae9-a210-66ae759bfe37","Type":"ContainerStarted","Data":"ebe44c5d8bed094f0deb2d3af386d2b9793282994e7d65f3a10ce5623c780161"} Feb 19 10:48:00 crc kubenswrapper[4873]: I0219 10:48:00.373522 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gn8wh" podStartSLOduration=2.767193308 podStartE2EDuration="9.37350121s" podCreationTimestamp="2026-02-19 10:47:51 +0000 UTC" firstStartedPulling="2026-02-19 10:47:53.251381586 +0000 UTC m=+3782.540813264" lastFinishedPulling="2026-02-19 10:47:59.857689538 +0000 UTC m=+3789.147121166" observedRunningTime="2026-02-19 10:48:00.364660529 +0000 UTC m=+3789.654092167" watchObservedRunningTime="2026-02-19 10:48:00.37350121 +0000 UTC m=+3789.662932848" Feb 19 10:48:01 crc kubenswrapper[4873]: I0219 10:48:01.541596 4873 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:48:01 crc kubenswrapper[4873]: I0219 10:48:01.541981 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:48:02 crc kubenswrapper[4873]: I0219 10:48:02.586520 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-gn8wh" podUID="a65669cf-686e-4ae9-a210-66ae759bfe37" containerName="registry-server" probeResult="failure" output=< Feb 19 10:48:02 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s Feb 19 10:48:02 crc kubenswrapper[4873]: > Feb 19 10:48:11 crc kubenswrapper[4873]: I0219 10:48:11.491014 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:48:11 crc kubenswrapper[4873]: E0219 10:48:11.491767 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:48:11 crc kubenswrapper[4873]: I0219 10:48:11.597077 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:48:11 crc kubenswrapper[4873]: I0219 10:48:11.647956 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:48:11 crc kubenswrapper[4873]: I0219 10:48:11.839895 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gn8wh"] Feb 19 10:48:13 crc kubenswrapper[4873]: 
I0219 10:48:13.473401 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gn8wh" podUID="a65669cf-686e-4ae9-a210-66ae759bfe37" containerName="registry-server" containerID="cri-o://ebe44c5d8bed094f0deb2d3af386d2b9793282994e7d65f3a10ce5623c780161" gracePeriod=2 Feb 19 10:48:13 crc kubenswrapper[4873]: I0219 10:48:13.999121 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.120717 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26vkd\" (UniqueName: \"kubernetes.io/projected/a65669cf-686e-4ae9-a210-66ae759bfe37-kube-api-access-26vkd\") pod \"a65669cf-686e-4ae9-a210-66ae759bfe37\" (UID: \"a65669cf-686e-4ae9-a210-66ae759bfe37\") " Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.121280 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65669cf-686e-4ae9-a210-66ae759bfe37-utilities\") pod \"a65669cf-686e-4ae9-a210-66ae759bfe37\" (UID: \"a65669cf-686e-4ae9-a210-66ae759bfe37\") " Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.121328 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65669cf-686e-4ae9-a210-66ae759bfe37-catalog-content\") pod \"a65669cf-686e-4ae9-a210-66ae759bfe37\" (UID: \"a65669cf-686e-4ae9-a210-66ae759bfe37\") " Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.122659 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a65669cf-686e-4ae9-a210-66ae759bfe37-utilities" (OuterVolumeSpecName: "utilities") pod "a65669cf-686e-4ae9-a210-66ae759bfe37" (UID: "a65669cf-686e-4ae9-a210-66ae759bfe37"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.128380 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a65669cf-686e-4ae9-a210-66ae759bfe37-kube-api-access-26vkd" (OuterVolumeSpecName: "kube-api-access-26vkd") pod "a65669cf-686e-4ae9-a210-66ae759bfe37" (UID: "a65669cf-686e-4ae9-a210-66ae759bfe37"). InnerVolumeSpecName "kube-api-access-26vkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.189753 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a65669cf-686e-4ae9-a210-66ae759bfe37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a65669cf-686e-4ae9-a210-66ae759bfe37" (UID: "a65669cf-686e-4ae9-a210-66ae759bfe37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.223866 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26vkd\" (UniqueName: \"kubernetes.io/projected/a65669cf-686e-4ae9-a210-66ae759bfe37-kube-api-access-26vkd\") on node \"crc\" DevicePath \"\"" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.223908 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a65669cf-686e-4ae9-a210-66ae759bfe37-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.223918 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a65669cf-686e-4ae9-a210-66ae759bfe37-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.513428 4873 generic.go:334] "Generic (PLEG): container finished" podID="a65669cf-686e-4ae9-a210-66ae759bfe37" 
containerID="ebe44c5d8bed094f0deb2d3af386d2b9793282994e7d65f3a10ce5623c780161" exitCode=0 Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.513495 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn8wh" event={"ID":"a65669cf-686e-4ae9-a210-66ae759bfe37","Type":"ContainerDied","Data":"ebe44c5d8bed094f0deb2d3af386d2b9793282994e7d65f3a10ce5623c780161"} Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.513559 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn8wh" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.513583 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn8wh" event={"ID":"a65669cf-686e-4ae9-a210-66ae759bfe37","Type":"ContainerDied","Data":"15e6bed6cf4319505dcff2fd62cf8014563b596799be90375f21165a31ca87b7"} Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.513610 4873 scope.go:117] "RemoveContainer" containerID="ebe44c5d8bed094f0deb2d3af386d2b9793282994e7d65f3a10ce5623c780161" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.549538 4873 scope.go:117] "RemoveContainer" containerID="43e61c3a28ba9a4e810156c46101fe16c402171370482f2ea90d30248e76b25f" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.551039 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gn8wh"] Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.561046 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gn8wh"] Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.570642 4873 scope.go:117] "RemoveContainer" containerID="d0bcc2985ae3863253dbcfc0804c289bd2f9b9d3b0af599bfe995ac18937285d" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.622438 4873 scope.go:117] "RemoveContainer" containerID="ebe44c5d8bed094f0deb2d3af386d2b9793282994e7d65f3a10ce5623c780161" Feb 19 
10:48:14 crc kubenswrapper[4873]: E0219 10:48:14.622972 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebe44c5d8bed094f0deb2d3af386d2b9793282994e7d65f3a10ce5623c780161\": container with ID starting with ebe44c5d8bed094f0deb2d3af386d2b9793282994e7d65f3a10ce5623c780161 not found: ID does not exist" containerID="ebe44c5d8bed094f0deb2d3af386d2b9793282994e7d65f3a10ce5623c780161" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.623020 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebe44c5d8bed094f0deb2d3af386d2b9793282994e7d65f3a10ce5623c780161"} err="failed to get container status \"ebe44c5d8bed094f0deb2d3af386d2b9793282994e7d65f3a10ce5623c780161\": rpc error: code = NotFound desc = could not find container \"ebe44c5d8bed094f0deb2d3af386d2b9793282994e7d65f3a10ce5623c780161\": container with ID starting with ebe44c5d8bed094f0deb2d3af386d2b9793282994e7d65f3a10ce5623c780161 not found: ID does not exist" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.623048 4873 scope.go:117] "RemoveContainer" containerID="43e61c3a28ba9a4e810156c46101fe16c402171370482f2ea90d30248e76b25f" Feb 19 10:48:14 crc kubenswrapper[4873]: E0219 10:48:14.623470 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43e61c3a28ba9a4e810156c46101fe16c402171370482f2ea90d30248e76b25f\": container with ID starting with 43e61c3a28ba9a4e810156c46101fe16c402171370482f2ea90d30248e76b25f not found: ID does not exist" containerID="43e61c3a28ba9a4e810156c46101fe16c402171370482f2ea90d30248e76b25f" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.623492 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e61c3a28ba9a4e810156c46101fe16c402171370482f2ea90d30248e76b25f"} err="failed to get container status 
\"43e61c3a28ba9a4e810156c46101fe16c402171370482f2ea90d30248e76b25f\": rpc error: code = NotFound desc = could not find container \"43e61c3a28ba9a4e810156c46101fe16c402171370482f2ea90d30248e76b25f\": container with ID starting with 43e61c3a28ba9a4e810156c46101fe16c402171370482f2ea90d30248e76b25f not found: ID does not exist" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.623504 4873 scope.go:117] "RemoveContainer" containerID="d0bcc2985ae3863253dbcfc0804c289bd2f9b9d3b0af599bfe995ac18937285d" Feb 19 10:48:14 crc kubenswrapper[4873]: E0219 10:48:14.623934 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0bcc2985ae3863253dbcfc0804c289bd2f9b9d3b0af599bfe995ac18937285d\": container with ID starting with d0bcc2985ae3863253dbcfc0804c289bd2f9b9d3b0af599bfe995ac18937285d not found: ID does not exist" containerID="d0bcc2985ae3863253dbcfc0804c289bd2f9b9d3b0af599bfe995ac18937285d" Feb 19 10:48:14 crc kubenswrapper[4873]: I0219 10:48:14.624012 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0bcc2985ae3863253dbcfc0804c289bd2f9b9d3b0af599bfe995ac18937285d"} err="failed to get container status \"d0bcc2985ae3863253dbcfc0804c289bd2f9b9d3b0af599bfe995ac18937285d\": rpc error: code = NotFound desc = could not find container \"d0bcc2985ae3863253dbcfc0804c289bd2f9b9d3b0af599bfe995ac18937285d\": container with ID starting with d0bcc2985ae3863253dbcfc0804c289bd2f9b9d3b0af599bfe995ac18937285d not found: ID does not exist" Feb 19 10:48:15 crc kubenswrapper[4873]: I0219 10:48:15.495797 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a65669cf-686e-4ae9-a210-66ae759bfe37" path="/var/lib/kubelet/pods/a65669cf-686e-4ae9-a210-66ae759bfe37/volumes" Feb 19 10:48:23 crc kubenswrapper[4873]: I0219 10:48:23.484300 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 
10:48:23 crc kubenswrapper[4873]: E0219 10:48:23.485184 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:48:34 crc kubenswrapper[4873]: I0219 10:48:34.484529 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:48:34 crc kubenswrapper[4873]: E0219 10:48:34.485528 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:48:49 crc kubenswrapper[4873]: I0219 10:48:49.484753 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:48:49 crc kubenswrapper[4873]: E0219 10:48:49.485578 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:49:03 crc kubenswrapper[4873]: I0219 10:49:03.484114 4873 scope.go:117] "RemoveContainer" 
containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:49:03 crc kubenswrapper[4873]: E0219 10:49:03.485003 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:49:16 crc kubenswrapper[4873]: I0219 10:49:16.484068 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:49:16 crc kubenswrapper[4873]: E0219 10:49:16.484885 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:49:30 crc kubenswrapper[4873]: I0219 10:49:30.484679 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:49:30 crc kubenswrapper[4873]: E0219 10:49:30.485222 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:49:43 crc kubenswrapper[4873]: I0219 10:49:43.485286 4873 scope.go:117] 
"RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:49:43 crc kubenswrapper[4873]: E0219 10:49:43.486567 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:49:57 crc kubenswrapper[4873]: I0219 10:49:57.484602 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:49:57 crc kubenswrapper[4873]: I0219 10:49:57.935448 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"3f289291e3d6f52c31a0d326462313d44367b683964d6ef342d209f000362ec9"} Feb 19 10:52:18 crc kubenswrapper[4873]: I0219 10:52:18.240161 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:52:18 crc kubenswrapper[4873]: I0219 10:52:18.240931 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.458340 4873 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-r78w4"] Feb 19 10:52:25 crc kubenswrapper[4873]: E0219 10:52:25.459389 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65669cf-686e-4ae9-a210-66ae759bfe37" containerName="extract-content" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.459405 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65669cf-686e-4ae9-a210-66ae759bfe37" containerName="extract-content" Feb 19 10:52:25 crc kubenswrapper[4873]: E0219 10:52:25.459444 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65669cf-686e-4ae9-a210-66ae759bfe37" containerName="extract-utilities" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.459454 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65669cf-686e-4ae9-a210-66ae759bfe37" containerName="extract-utilities" Feb 19 10:52:25 crc kubenswrapper[4873]: E0219 10:52:25.459497 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65669cf-686e-4ae9-a210-66ae759bfe37" containerName="registry-server" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.459506 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65669cf-686e-4ae9-a210-66ae759bfe37" containerName="registry-server" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.459741 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a65669cf-686e-4ae9-a210-66ae759bfe37" containerName="registry-server" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.461714 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.502318 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r78w4"] Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.601516 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d91352c-7639-49f0-baf7-bd343bb59c42-catalog-content\") pod \"redhat-operators-r78w4\" (UID: \"0d91352c-7639-49f0-baf7-bd343bb59c42\") " pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.601656 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqm5h\" (UniqueName: \"kubernetes.io/projected/0d91352c-7639-49f0-baf7-bd343bb59c42-kube-api-access-vqm5h\") pod \"redhat-operators-r78w4\" (UID: \"0d91352c-7639-49f0-baf7-bd343bb59c42\") " pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.601874 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d91352c-7639-49f0-baf7-bd343bb59c42-utilities\") pod \"redhat-operators-r78w4\" (UID: \"0d91352c-7639-49f0-baf7-bd343bb59c42\") " pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.703278 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d91352c-7639-49f0-baf7-bd343bb59c42-catalog-content\") pod \"redhat-operators-r78w4\" (UID: \"0d91352c-7639-49f0-baf7-bd343bb59c42\") " pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.703597 4873 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-vqm5h\" (UniqueName: \"kubernetes.io/projected/0d91352c-7639-49f0-baf7-bd343bb59c42-kube-api-access-vqm5h\") pod \"redhat-operators-r78w4\" (UID: \"0d91352c-7639-49f0-baf7-bd343bb59c42\") " pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.703766 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d91352c-7639-49f0-baf7-bd343bb59c42-catalog-content\") pod \"redhat-operators-r78w4\" (UID: \"0d91352c-7639-49f0-baf7-bd343bb59c42\") " pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.703778 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d91352c-7639-49f0-baf7-bd343bb59c42-utilities\") pod \"redhat-operators-r78w4\" (UID: \"0d91352c-7639-49f0-baf7-bd343bb59c42\") " pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.704172 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d91352c-7639-49f0-baf7-bd343bb59c42-utilities\") pod \"redhat-operators-r78w4\" (UID: \"0d91352c-7639-49f0-baf7-bd343bb59c42\") " pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.726954 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqm5h\" (UniqueName: \"kubernetes.io/projected/0d91352c-7639-49f0-baf7-bd343bb59c42-kube-api-access-vqm5h\") pod \"redhat-operators-r78w4\" (UID: \"0d91352c-7639-49f0-baf7-bd343bb59c42\") " pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:25 crc kubenswrapper[4873]: I0219 10:52:25.795847 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:26 crc kubenswrapper[4873]: I0219 10:52:26.356528 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r78w4"] Feb 19 10:52:26 crc kubenswrapper[4873]: I0219 10:52:26.714270 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r78w4" event={"ID":"0d91352c-7639-49f0-baf7-bd343bb59c42","Type":"ContainerStarted","Data":"6a0e3dd90bd9baa2e4af30ac07e6618b6ab8119c61a7e1c9c34b58b29982ab50"} Feb 19 10:52:27 crc kubenswrapper[4873]: I0219 10:52:27.728329 4873 generic.go:334] "Generic (PLEG): container finished" podID="0d91352c-7639-49f0-baf7-bd343bb59c42" containerID="27af10342a7e1ad2491360a91681d932dd63d524dbab6ca3add1793c15d831ca" exitCode=0 Feb 19 10:52:27 crc kubenswrapper[4873]: I0219 10:52:27.728411 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r78w4" event={"ID":"0d91352c-7639-49f0-baf7-bd343bb59c42","Type":"ContainerDied","Data":"27af10342a7e1ad2491360a91681d932dd63d524dbab6ca3add1793c15d831ca"} Feb 19 10:52:29 crc kubenswrapper[4873]: I0219 10:52:29.760592 4873 generic.go:334] "Generic (PLEG): container finished" podID="0d91352c-7639-49f0-baf7-bd343bb59c42" containerID="2e2552b2c4ecf36a5cf4a36c966c229ea32d862bd7b139e3133ee0c86e6b974d" exitCode=0 Feb 19 10:52:29 crc kubenswrapper[4873]: I0219 10:52:29.761008 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r78w4" event={"ID":"0d91352c-7639-49f0-baf7-bd343bb59c42","Type":"ContainerDied","Data":"2e2552b2c4ecf36a5cf4a36c966c229ea32d862bd7b139e3133ee0c86e6b974d"} Feb 19 10:52:30 crc kubenswrapper[4873]: I0219 10:52:30.773278 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r78w4" 
event={"ID":"0d91352c-7639-49f0-baf7-bd343bb59c42","Type":"ContainerStarted","Data":"c7b53d3544f9dd7da88afeb6fa02fe407f19ea68d37f09767e6132e62b14b454"} Feb 19 10:52:35 crc kubenswrapper[4873]: I0219 10:52:35.797052 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:35 crc kubenswrapper[4873]: I0219 10:52:35.797663 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:36 crc kubenswrapper[4873]: I0219 10:52:36.338957 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r78w4" podStartSLOduration=8.843319756 podStartE2EDuration="11.338935225s" podCreationTimestamp="2026-02-19 10:52:25 +0000 UTC" firstStartedPulling="2026-02-19 10:52:27.731766128 +0000 UTC m=+4057.021197766" lastFinishedPulling="2026-02-19 10:52:30.227381567 +0000 UTC m=+4059.516813235" observedRunningTime="2026-02-19 10:52:30.798677552 +0000 UTC m=+4060.088109230" watchObservedRunningTime="2026-02-19 10:52:36.338935225 +0000 UTC m=+4065.628366863" Feb 19 10:52:36 crc kubenswrapper[4873]: I0219 10:52:36.353063 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xxf58"] Feb 19 10:52:36 crc kubenswrapper[4873]: I0219 10:52:36.355991 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:36 crc kubenswrapper[4873]: I0219 10:52:36.363767 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxf58"] Feb 19 10:52:36 crc kubenswrapper[4873]: I0219 10:52:36.546323 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hbvt\" (UniqueName: \"kubernetes.io/projected/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-kube-api-access-2hbvt\") pod \"redhat-marketplace-xxf58\" (UID: \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\") " pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:36 crc kubenswrapper[4873]: I0219 10:52:36.546400 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-utilities\") pod \"redhat-marketplace-xxf58\" (UID: \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\") " pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:36 crc kubenswrapper[4873]: I0219 10:52:36.547085 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-catalog-content\") pod \"redhat-marketplace-xxf58\" (UID: \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\") " pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:36 crc kubenswrapper[4873]: I0219 10:52:36.649069 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-catalog-content\") pod \"redhat-marketplace-xxf58\" (UID: \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\") " pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:36 crc kubenswrapper[4873]: I0219 10:52:36.649190 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2hbvt\" (UniqueName: \"kubernetes.io/projected/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-kube-api-access-2hbvt\") pod \"redhat-marketplace-xxf58\" (UID: \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\") " pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:36 crc kubenswrapper[4873]: I0219 10:52:36.649236 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-utilities\") pod \"redhat-marketplace-xxf58\" (UID: \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\") " pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:36 crc kubenswrapper[4873]: I0219 10:52:36.649862 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-catalog-content\") pod \"redhat-marketplace-xxf58\" (UID: \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\") " pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:36 crc kubenswrapper[4873]: I0219 10:52:36.651174 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-utilities\") pod \"redhat-marketplace-xxf58\" (UID: \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\") " pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:36 crc kubenswrapper[4873]: I0219 10:52:36.872309 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r78w4" podUID="0d91352c-7639-49f0-baf7-bd343bb59c42" containerName="registry-server" probeResult="failure" output=< Feb 19 10:52:36 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s Feb 19 10:52:36 crc kubenswrapper[4873]: > Feb 19 10:52:37 crc kubenswrapper[4873]: I0219 10:52:37.087052 4873 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-2hbvt\" (UniqueName: \"kubernetes.io/projected/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-kube-api-access-2hbvt\") pod \"redhat-marketplace-xxf58\" (UID: \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\") " pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:37 crc kubenswrapper[4873]: I0219 10:52:37.281176 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:37 crc kubenswrapper[4873]: I0219 10:52:37.823693 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxf58"] Feb 19 10:52:37 crc kubenswrapper[4873]: I0219 10:52:37.869350 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxf58" event={"ID":"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a","Type":"ContainerStarted","Data":"194a5c8ec3965dc48c4446e3c729e96059f10e97ac315dc79d6033eed5b46683"} Feb 19 10:52:38 crc kubenswrapper[4873]: I0219 10:52:38.881735 4873 generic.go:334] "Generic (PLEG): container finished" podID="8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" containerID="a13f60368c0055f1e74315a69214cbe8f6e44e0d47ad4202b97e5f3d097b12be" exitCode=0 Feb 19 10:52:38 crc kubenswrapper[4873]: I0219 10:52:38.881836 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxf58" event={"ID":"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a","Type":"ContainerDied","Data":"a13f60368c0055f1e74315a69214cbe8f6e44e0d47ad4202b97e5f3d097b12be"} Feb 19 10:52:40 crc kubenswrapper[4873]: I0219 10:52:40.902908 4873 generic.go:334] "Generic (PLEG): container finished" podID="8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" containerID="9576149c2a6edb3dd307caf70a679b5bcabf22e19412e9a41aab4e35d4aa0e9f" exitCode=0 Feb 19 10:52:40 crc kubenswrapper[4873]: I0219 10:52:40.903505 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxf58" 
event={"ID":"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a","Type":"ContainerDied","Data":"9576149c2a6edb3dd307caf70a679b5bcabf22e19412e9a41aab4e35d4aa0e9f"} Feb 19 10:52:42 crc kubenswrapper[4873]: I0219 10:52:42.946762 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxf58" event={"ID":"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a","Type":"ContainerStarted","Data":"6f2919947fabd994e41a4657c0c39dc555238ed077e615ca386b18a394515ac4"} Feb 19 10:52:42 crc kubenswrapper[4873]: I0219 10:52:42.970471 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xxf58" podStartSLOduration=4.018271224 podStartE2EDuration="6.970450482s" podCreationTimestamp="2026-02-19 10:52:36 +0000 UTC" firstStartedPulling="2026-02-19 10:52:38.883913969 +0000 UTC m=+4068.173345607" lastFinishedPulling="2026-02-19 10:52:41.836093227 +0000 UTC m=+4071.125524865" observedRunningTime="2026-02-19 10:52:42.966235377 +0000 UTC m=+4072.255667025" watchObservedRunningTime="2026-02-19 10:52:42.970450482 +0000 UTC m=+4072.259882130" Feb 19 10:52:45 crc kubenswrapper[4873]: I0219 10:52:45.851391 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:45 crc kubenswrapper[4873]: I0219 10:52:45.905174 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:46 crc kubenswrapper[4873]: I0219 10:52:46.092691 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r78w4"] Feb 19 10:52:46 crc kubenswrapper[4873]: I0219 10:52:46.980146 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r78w4" podUID="0d91352c-7639-49f0-baf7-bd343bb59c42" containerName="registry-server" 
containerID="cri-o://c7b53d3544f9dd7da88afeb6fa02fe407f19ea68d37f09767e6132e62b14b454" gracePeriod=2 Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:47.281349 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:47.282440 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:47.951980 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:47.995466 4873 generic.go:334] "Generic (PLEG): container finished" podID="0d91352c-7639-49f0-baf7-bd343bb59c42" containerID="c7b53d3544f9dd7da88afeb6fa02fe407f19ea68d37f09767e6132e62b14b454" exitCode=0 Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:47.995611 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r78w4" event={"ID":"0d91352c-7639-49f0-baf7-bd343bb59c42","Type":"ContainerDied","Data":"c7b53d3544f9dd7da88afeb6fa02fe407f19ea68d37f09767e6132e62b14b454"} Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:47.995655 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r78w4" event={"ID":"0d91352c-7639-49f0-baf7-bd343bb59c42","Type":"ContainerDied","Data":"6a0e3dd90bd9baa2e4af30ac07e6618b6ab8119c61a7e1c9c34b58b29982ab50"} Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:47.995681 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a0e3dd90bd9baa2e4af30ac07e6618b6ab8119c61a7e1c9c34b58b29982ab50" Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:48.003072 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:48.040326 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:48.096474 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d91352c-7639-49f0-baf7-bd343bb59c42-catalog-content\") pod \"0d91352c-7639-49f0-baf7-bd343bb59c42\" (UID: \"0d91352c-7639-49f0-baf7-bd343bb59c42\") " Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:48.096636 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d91352c-7639-49f0-baf7-bd343bb59c42-utilities\") pod \"0d91352c-7639-49f0-baf7-bd343bb59c42\" (UID: \"0d91352c-7639-49f0-baf7-bd343bb59c42\") " Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:48.098000 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d91352c-7639-49f0-baf7-bd343bb59c42-utilities" (OuterVolumeSpecName: "utilities") pod "0d91352c-7639-49f0-baf7-bd343bb59c42" (UID: "0d91352c-7639-49f0-baf7-bd343bb59c42"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:48.098079 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqm5h\" (UniqueName: \"kubernetes.io/projected/0d91352c-7639-49f0-baf7-bd343bb59c42-kube-api-access-vqm5h\") pod \"0d91352c-7639-49f0-baf7-bd343bb59c42\" (UID: \"0d91352c-7639-49f0-baf7-bd343bb59c42\") " Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:48.099047 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d91352c-7639-49f0-baf7-bd343bb59c42-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:48.104942 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d91352c-7639-49f0-baf7-bd343bb59c42-kube-api-access-vqm5h" (OuterVolumeSpecName: "kube-api-access-vqm5h") pod "0d91352c-7639-49f0-baf7-bd343bb59c42" (UID: "0d91352c-7639-49f0-baf7-bd343bb59c42"). InnerVolumeSpecName "kube-api-access-vqm5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:48.200910 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqm5h\" (UniqueName: \"kubernetes.io/projected/0d91352c-7639-49f0-baf7-bd343bb59c42-kube-api-access-vqm5h\") on node \"crc\" DevicePath \"\"" Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:48.235878 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d91352c-7639-49f0-baf7-bd343bb59c42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d91352c-7639-49f0-baf7-bd343bb59c42" (UID: "0d91352c-7639-49f0-baf7-bd343bb59c42"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:48.240506 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:48.240567 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:48.302609 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d91352c-7639-49f0-baf7-bd343bb59c42-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 10:52:48 crc kubenswrapper[4873]: I0219 10:52:48.918040 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxf58"] Feb 19 10:52:49 crc kubenswrapper[4873]: I0219 10:52:49.003628 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r78w4" Feb 19 10:52:49 crc kubenswrapper[4873]: I0219 10:52:49.056355 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r78w4"] Feb 19 10:52:49 crc kubenswrapper[4873]: I0219 10:52:49.066126 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r78w4"] Feb 19 10:52:49 crc kubenswrapper[4873]: I0219 10:52:49.494827 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d91352c-7639-49f0-baf7-bd343bb59c42" path="/var/lib/kubelet/pods/0d91352c-7639-49f0-baf7-bd343bb59c42/volumes" Feb 19 10:52:50 crc kubenswrapper[4873]: I0219 10:52:50.011471 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xxf58" podUID="8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" containerName="registry-server" containerID="cri-o://6f2919947fabd994e41a4657c0c39dc555238ed077e615ca386b18a394515ac4" gracePeriod=2 Feb 19 10:52:50 crc kubenswrapper[4873]: I0219 10:52:50.530856 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:50 crc kubenswrapper[4873]: I0219 10:52:50.649969 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-utilities\") pod \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\" (UID: \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\") " Feb 19 10:52:50 crc kubenswrapper[4873]: I0219 10:52:50.650050 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hbvt\" (UniqueName: \"kubernetes.io/projected/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-kube-api-access-2hbvt\") pod \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\" (UID: \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\") " Feb 19 10:52:50 crc kubenswrapper[4873]: I0219 10:52:50.650124 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-catalog-content\") pod \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\" (UID: \"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a\") " Feb 19 10:52:50 crc kubenswrapper[4873]: I0219 10:52:50.651000 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-utilities" (OuterVolumeSpecName: "utilities") pod "8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" (UID: "8f4dc211-905d-41b4-bb4a-3f7a61ddc43a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:52:50 crc kubenswrapper[4873]: I0219 10:52:50.658123 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-kube-api-access-2hbvt" (OuterVolumeSpecName: "kube-api-access-2hbvt") pod "8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" (UID: "8f4dc211-905d-41b4-bb4a-3f7a61ddc43a"). InnerVolumeSpecName "kube-api-access-2hbvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 10:52:50 crc kubenswrapper[4873]: I0219 10:52:50.752664 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 10:52:50 crc kubenswrapper[4873]: I0219 10:52:50.752693 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hbvt\" (UniqueName: \"kubernetes.io/projected/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-kube-api-access-2hbvt\") on node \"crc\" DevicePath \"\"" Feb 19 10:52:50 crc kubenswrapper[4873]: I0219 10:52:50.792947 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" (UID: "8f4dc211-905d-41b4-bb4a-3f7a61ddc43a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 10:52:50 crc kubenswrapper[4873]: I0219 10:52:50.854188 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.023649 4873 generic.go:334] "Generic (PLEG): container finished" podID="8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" containerID="6f2919947fabd994e41a4657c0c39dc555238ed077e615ca386b18a394515ac4" exitCode=0 Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.023691 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxf58" event={"ID":"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a","Type":"ContainerDied","Data":"6f2919947fabd994e41a4657c0c39dc555238ed077e615ca386b18a394515ac4"} Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.023717 4873 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-xxf58" event={"ID":"8f4dc211-905d-41b4-bb4a-3f7a61ddc43a","Type":"ContainerDied","Data":"194a5c8ec3965dc48c4446e3c729e96059f10e97ac315dc79d6033eed5b46683"} Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.023734 4873 scope.go:117] "RemoveContainer" containerID="6f2919947fabd994e41a4657c0c39dc555238ed077e615ca386b18a394515ac4" Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.023751 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxf58" Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.054982 4873 scope.go:117] "RemoveContainer" containerID="9576149c2a6edb3dd307caf70a679b5bcabf22e19412e9a41aab4e35d4aa0e9f" Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.060916 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxf58"] Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.069757 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxf58"] Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.078904 4873 scope.go:117] "RemoveContainer" containerID="a13f60368c0055f1e74315a69214cbe8f6e44e0d47ad4202b97e5f3d097b12be" Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.134657 4873 scope.go:117] "RemoveContainer" containerID="6f2919947fabd994e41a4657c0c39dc555238ed077e615ca386b18a394515ac4" Feb 19 10:52:51 crc kubenswrapper[4873]: E0219 10:52:51.135293 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f2919947fabd994e41a4657c0c39dc555238ed077e615ca386b18a394515ac4\": container with ID starting with 6f2919947fabd994e41a4657c0c39dc555238ed077e615ca386b18a394515ac4 not found: ID does not exist" containerID="6f2919947fabd994e41a4657c0c39dc555238ed077e615ca386b18a394515ac4" Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.135350 4873 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2919947fabd994e41a4657c0c39dc555238ed077e615ca386b18a394515ac4"} err="failed to get container status \"6f2919947fabd994e41a4657c0c39dc555238ed077e615ca386b18a394515ac4\": rpc error: code = NotFound desc = could not find container \"6f2919947fabd994e41a4657c0c39dc555238ed077e615ca386b18a394515ac4\": container with ID starting with 6f2919947fabd994e41a4657c0c39dc555238ed077e615ca386b18a394515ac4 not found: ID does not exist" Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.135386 4873 scope.go:117] "RemoveContainer" containerID="9576149c2a6edb3dd307caf70a679b5bcabf22e19412e9a41aab4e35d4aa0e9f" Feb 19 10:52:51 crc kubenswrapper[4873]: E0219 10:52:51.136574 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9576149c2a6edb3dd307caf70a679b5bcabf22e19412e9a41aab4e35d4aa0e9f\": container with ID starting with 9576149c2a6edb3dd307caf70a679b5bcabf22e19412e9a41aab4e35d4aa0e9f not found: ID does not exist" containerID="9576149c2a6edb3dd307caf70a679b5bcabf22e19412e9a41aab4e35d4aa0e9f" Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.136600 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9576149c2a6edb3dd307caf70a679b5bcabf22e19412e9a41aab4e35d4aa0e9f"} err="failed to get container status \"9576149c2a6edb3dd307caf70a679b5bcabf22e19412e9a41aab4e35d4aa0e9f\": rpc error: code = NotFound desc = could not find container \"9576149c2a6edb3dd307caf70a679b5bcabf22e19412e9a41aab4e35d4aa0e9f\": container with ID starting with 9576149c2a6edb3dd307caf70a679b5bcabf22e19412e9a41aab4e35d4aa0e9f not found: ID does not exist" Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.136622 4873 scope.go:117] "RemoveContainer" containerID="a13f60368c0055f1e74315a69214cbe8f6e44e0d47ad4202b97e5f3d097b12be" Feb 19 10:52:51 crc kubenswrapper[4873]: E0219 
10:52:51.136992 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a13f60368c0055f1e74315a69214cbe8f6e44e0d47ad4202b97e5f3d097b12be\": container with ID starting with a13f60368c0055f1e74315a69214cbe8f6e44e0d47ad4202b97e5f3d097b12be not found: ID does not exist" containerID="a13f60368c0055f1e74315a69214cbe8f6e44e0d47ad4202b97e5f3d097b12be" Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.137036 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a13f60368c0055f1e74315a69214cbe8f6e44e0d47ad4202b97e5f3d097b12be"} err="failed to get container status \"a13f60368c0055f1e74315a69214cbe8f6e44e0d47ad4202b97e5f3d097b12be\": rpc error: code = NotFound desc = could not find container \"a13f60368c0055f1e74315a69214cbe8f6e44e0d47ad4202b97e5f3d097b12be\": container with ID starting with a13f60368c0055f1e74315a69214cbe8f6e44e0d47ad4202b97e5f3d097b12be not found: ID does not exist" Feb 19 10:52:51 crc kubenswrapper[4873]: I0219 10:52:51.498090 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" path="/var/lib/kubelet/pods/8f4dc211-905d-41b4-bb4a-3f7a61ddc43a/volumes" Feb 19 10:53:18 crc kubenswrapper[4873]: I0219 10:53:18.241431 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 10:53:18 crc kubenswrapper[4873]: I0219 10:53:18.241964 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 19 10:53:18 crc kubenswrapper[4873]: I0219 10:53:18.242016 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 10:53:18 crc kubenswrapper[4873]: I0219 10:53:18.242838 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f289291e3d6f52c31a0d326462313d44367b683964d6ef342d209f000362ec9"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 10:53:18 crc kubenswrapper[4873]: I0219 10:53:18.243009 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://3f289291e3d6f52c31a0d326462313d44367b683964d6ef342d209f000362ec9" gracePeriod=600 Feb 19 10:53:19 crc kubenswrapper[4873]: I0219 10:53:19.283248 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="3f289291e3d6f52c31a0d326462313d44367b683964d6ef342d209f000362ec9" exitCode=0 Feb 19 10:53:19 crc kubenswrapper[4873]: I0219 10:53:19.283341 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"3f289291e3d6f52c31a0d326462313d44367b683964d6ef342d209f000362ec9"} Feb 19 10:53:19 crc kubenswrapper[4873]: I0219 10:53:19.283529 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"} Feb 19 10:53:19 crc 
kubenswrapper[4873]: I0219 10:53:19.283550 4873 scope.go:117] "RemoveContainer" containerID="ea511bbcc48093928a25a3a4b468255a7f9fbafc2550fa8fd49de20ec45c97a4" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.393257 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gq42m"] Feb 19 10:53:43 crc kubenswrapper[4873]: E0219 10:53:43.394396 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d91352c-7639-49f0-baf7-bd343bb59c42" containerName="extract-content" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.394414 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d91352c-7639-49f0-baf7-bd343bb59c42" containerName="extract-content" Feb 19 10:53:43 crc kubenswrapper[4873]: E0219 10:53:43.394430 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" containerName="registry-server" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.394436 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" containerName="registry-server" Feb 19 10:53:43 crc kubenswrapper[4873]: E0219 10:53:43.394446 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" containerName="extract-content" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.394452 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" containerName="extract-content" Feb 19 10:53:43 crc kubenswrapper[4873]: E0219 10:53:43.394462 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d91352c-7639-49f0-baf7-bd343bb59c42" containerName="registry-server" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.394468 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d91352c-7639-49f0-baf7-bd343bb59c42" containerName="registry-server" Feb 19 10:53:43 crc kubenswrapper[4873]: E0219 10:53:43.394491 4873 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d91352c-7639-49f0-baf7-bd343bb59c42" containerName="extract-utilities" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.394498 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d91352c-7639-49f0-baf7-bd343bb59c42" containerName="extract-utilities" Feb 19 10:53:43 crc kubenswrapper[4873]: E0219 10:53:43.394522 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" containerName="extract-utilities" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.394528 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" containerName="extract-utilities" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.394705 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d91352c-7639-49f0-baf7-bd343bb59c42" containerName="registry-server" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.394725 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4dc211-905d-41b4-bb4a-3f7a61ddc43a" containerName="registry-server" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.396355 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.412563 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gq42m"] Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.462799 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb060ed-98a8-4d81-820a-8e2d26500534-utilities\") pod \"community-operators-gq42m\" (UID: \"adb060ed-98a8-4d81-820a-8e2d26500534\") " pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.462879 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb060ed-98a8-4d81-820a-8e2d26500534-catalog-content\") pod \"community-operators-gq42m\" (UID: \"adb060ed-98a8-4d81-820a-8e2d26500534\") " pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.463125 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlv7g\" (UniqueName: \"kubernetes.io/projected/adb060ed-98a8-4d81-820a-8e2d26500534-kube-api-access-vlv7g\") pod \"community-operators-gq42m\" (UID: \"adb060ed-98a8-4d81-820a-8e2d26500534\") " pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.564904 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlv7g\" (UniqueName: \"kubernetes.io/projected/adb060ed-98a8-4d81-820a-8e2d26500534-kube-api-access-vlv7g\") pod \"community-operators-gq42m\" (UID: \"adb060ed-98a8-4d81-820a-8e2d26500534\") " pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.566181 4873 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb060ed-98a8-4d81-820a-8e2d26500534-utilities\") pod \"community-operators-gq42m\" (UID: \"adb060ed-98a8-4d81-820a-8e2d26500534\") " pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.566350 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb060ed-98a8-4d81-820a-8e2d26500534-catalog-content\") pod \"community-operators-gq42m\" (UID: \"adb060ed-98a8-4d81-820a-8e2d26500534\") " pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.567046 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb060ed-98a8-4d81-820a-8e2d26500534-catalog-content\") pod \"community-operators-gq42m\" (UID: \"adb060ed-98a8-4d81-820a-8e2d26500534\") " pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.567154 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb060ed-98a8-4d81-820a-8e2d26500534-utilities\") pod \"community-operators-gq42m\" (UID: \"adb060ed-98a8-4d81-820a-8e2d26500534\") " pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.604870 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlv7g\" (UniqueName: \"kubernetes.io/projected/adb060ed-98a8-4d81-820a-8e2d26500534-kube-api-access-vlv7g\") pod \"community-operators-gq42m\" (UID: \"adb060ed-98a8-4d81-820a-8e2d26500534\") " pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:43 crc kubenswrapper[4873]: I0219 10:53:43.728457 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:44 crc kubenswrapper[4873]: I0219 10:53:44.273524 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gq42m"] Feb 19 10:53:44 crc kubenswrapper[4873]: I0219 10:53:44.527582 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gq42m" event={"ID":"adb060ed-98a8-4d81-820a-8e2d26500534","Type":"ContainerStarted","Data":"00e9be978a5058cfb41fc6f67a24942030ba394962d3935402b0a652ff531cd7"} Feb 19 10:53:45 crc kubenswrapper[4873]: I0219 10:53:45.542687 4873 generic.go:334] "Generic (PLEG): container finished" podID="adb060ed-98a8-4d81-820a-8e2d26500534" containerID="67c60116f380dfee9023370eececf0d5081e88cdb702860b7f01a8ec1aef69cc" exitCode=0 Feb 19 10:53:45 crc kubenswrapper[4873]: I0219 10:53:45.542923 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gq42m" event={"ID":"adb060ed-98a8-4d81-820a-8e2d26500534","Type":"ContainerDied","Data":"67c60116f380dfee9023370eececf0d5081e88cdb702860b7f01a8ec1aef69cc"} Feb 19 10:53:45 crc kubenswrapper[4873]: I0219 10:53:45.545353 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 10:53:47 crc kubenswrapper[4873]: I0219 10:53:47.564057 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gq42m" event={"ID":"adb060ed-98a8-4d81-820a-8e2d26500534","Type":"ContainerStarted","Data":"0f41fd460e2c60d591b7e1631be9e225edf447e4059e806496eb7983462df728"} Feb 19 10:53:48 crc kubenswrapper[4873]: I0219 10:53:48.575663 4873 generic.go:334] "Generic (PLEG): container finished" podID="adb060ed-98a8-4d81-820a-8e2d26500534" containerID="0f41fd460e2c60d591b7e1631be9e225edf447e4059e806496eb7983462df728" exitCode=0 Feb 19 10:53:48 crc kubenswrapper[4873]: I0219 10:53:48.575727 4873 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-gq42m" event={"ID":"adb060ed-98a8-4d81-820a-8e2d26500534","Type":"ContainerDied","Data":"0f41fd460e2c60d591b7e1631be9e225edf447e4059e806496eb7983462df728"} Feb 19 10:53:49 crc kubenswrapper[4873]: I0219 10:53:49.590590 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gq42m" event={"ID":"adb060ed-98a8-4d81-820a-8e2d26500534","Type":"ContainerStarted","Data":"a212603423738414e050cdf1f902d1187dfb145aa2651df5b68ad1f242b3b09e"} Feb 19 10:53:49 crc kubenswrapper[4873]: I0219 10:53:49.619983 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gq42m" podStartSLOduration=3.121397811 podStartE2EDuration="6.619965667s" podCreationTimestamp="2026-02-19 10:53:43 +0000 UTC" firstStartedPulling="2026-02-19 10:53:45.545148538 +0000 UTC m=+4134.834580166" lastFinishedPulling="2026-02-19 10:53:49.043716384 +0000 UTC m=+4138.333148022" observedRunningTime="2026-02-19 10:53:49.610680463 +0000 UTC m=+4138.900112111" watchObservedRunningTime="2026-02-19 10:53:49.619965667 +0000 UTC m=+4138.909397305" Feb 19 10:53:53 crc kubenswrapper[4873]: I0219 10:53:53.730057 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:53 crc kubenswrapper[4873]: I0219 10:53:53.730739 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:53 crc kubenswrapper[4873]: I0219 10:53:53.788321 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:54 crc kubenswrapper[4873]: I0219 10:53:54.698713 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:55 crc kubenswrapper[4873]: I0219 
10:53:55.980648 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gq42m"] Feb 19 10:53:56 crc kubenswrapper[4873]: I0219 10:53:56.658891 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gq42m" podUID="adb060ed-98a8-4d81-820a-8e2d26500534" containerName="registry-server" containerID="cri-o://a212603423738414e050cdf1f902d1187dfb145aa2651df5b68ad1f242b3b09e" gracePeriod=2 Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.204989 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gq42m" Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.282471 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb060ed-98a8-4d81-820a-8e2d26500534-utilities\") pod \"adb060ed-98a8-4d81-820a-8e2d26500534\" (UID: \"adb060ed-98a8-4d81-820a-8e2d26500534\") " Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.282618 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb060ed-98a8-4d81-820a-8e2d26500534-catalog-content\") pod \"adb060ed-98a8-4d81-820a-8e2d26500534\" (UID: \"adb060ed-98a8-4d81-820a-8e2d26500534\") " Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.282644 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlv7g\" (UniqueName: \"kubernetes.io/projected/adb060ed-98a8-4d81-820a-8e2d26500534-kube-api-access-vlv7g\") pod \"adb060ed-98a8-4d81-820a-8e2d26500534\" (UID: \"adb060ed-98a8-4d81-820a-8e2d26500534\") " Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.283481 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adb060ed-98a8-4d81-820a-8e2d26500534-utilities" (OuterVolumeSpecName: 
"utilities") pod "adb060ed-98a8-4d81-820a-8e2d26500534" (UID: "adb060ed-98a8-4d81-820a-8e2d26500534"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.297954 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb060ed-98a8-4d81-820a-8e2d26500534-kube-api-access-vlv7g" (OuterVolumeSpecName: "kube-api-access-vlv7g") pod "adb060ed-98a8-4d81-820a-8e2d26500534" (UID: "adb060ed-98a8-4d81-820a-8e2d26500534"). InnerVolumeSpecName "kube-api-access-vlv7g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.348630 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adb060ed-98a8-4d81-820a-8e2d26500534-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "adb060ed-98a8-4d81-820a-8e2d26500534" (UID: "adb060ed-98a8-4d81-820a-8e2d26500534"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.385508 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb060ed-98a8-4d81-820a-8e2d26500534-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.385562 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlv7g\" (UniqueName: \"kubernetes.io/projected/adb060ed-98a8-4d81-820a-8e2d26500534-kube-api-access-vlv7g\") on node \"crc\" DevicePath \"\""
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.385578 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb060ed-98a8-4d81-820a-8e2d26500534-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.671494 4873 generic.go:334] "Generic (PLEG): container finished" podID="adb060ed-98a8-4d81-820a-8e2d26500534" containerID="a212603423738414e050cdf1f902d1187dfb145aa2651df5b68ad1f242b3b09e" exitCode=0
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.671864 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gq42m" event={"ID":"adb060ed-98a8-4d81-820a-8e2d26500534","Type":"ContainerDied","Data":"a212603423738414e050cdf1f902d1187dfb145aa2651df5b68ad1f242b3b09e"}
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.671898 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gq42m" event={"ID":"adb060ed-98a8-4d81-820a-8e2d26500534","Type":"ContainerDied","Data":"00e9be978a5058cfb41fc6f67a24942030ba394962d3935402b0a652ff531cd7"}
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.671919 4873 scope.go:117] "RemoveContainer" containerID="a212603423738414e050cdf1f902d1187dfb145aa2651df5b68ad1f242b3b09e"
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.672115 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gq42m"
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.700200 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gq42m"]
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.703870 4873 scope.go:117] "RemoveContainer" containerID="0f41fd460e2c60d591b7e1631be9e225edf447e4059e806496eb7983462df728"
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.710524 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gq42m"]
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.730387 4873 scope.go:117] "RemoveContainer" containerID="67c60116f380dfee9023370eececf0d5081e88cdb702860b7f01a8ec1aef69cc"
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.794036 4873 scope.go:117] "RemoveContainer" containerID="a212603423738414e050cdf1f902d1187dfb145aa2651df5b68ad1f242b3b09e"
Feb 19 10:53:57 crc kubenswrapper[4873]: E0219 10:53:57.795528 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a212603423738414e050cdf1f902d1187dfb145aa2651df5b68ad1f242b3b09e\": container with ID starting with a212603423738414e050cdf1f902d1187dfb145aa2651df5b68ad1f242b3b09e not found: ID does not exist" containerID="a212603423738414e050cdf1f902d1187dfb145aa2651df5b68ad1f242b3b09e"
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.795932 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a212603423738414e050cdf1f902d1187dfb145aa2651df5b68ad1f242b3b09e"} err="failed to get container status \"a212603423738414e050cdf1f902d1187dfb145aa2651df5b68ad1f242b3b09e\": rpc error: code = NotFound desc = could not find container \"a212603423738414e050cdf1f902d1187dfb145aa2651df5b68ad1f242b3b09e\": container with ID starting with a212603423738414e050cdf1f902d1187dfb145aa2651df5b68ad1f242b3b09e not found: ID does not exist"
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.796057 4873 scope.go:117] "RemoveContainer" containerID="0f41fd460e2c60d591b7e1631be9e225edf447e4059e806496eb7983462df728"
Feb 19 10:53:57 crc kubenswrapper[4873]: E0219 10:53:57.799981 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f41fd460e2c60d591b7e1631be9e225edf447e4059e806496eb7983462df728\": container with ID starting with 0f41fd460e2c60d591b7e1631be9e225edf447e4059e806496eb7983462df728 not found: ID does not exist" containerID="0f41fd460e2c60d591b7e1631be9e225edf447e4059e806496eb7983462df728"
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.800035 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f41fd460e2c60d591b7e1631be9e225edf447e4059e806496eb7983462df728"} err="failed to get container status \"0f41fd460e2c60d591b7e1631be9e225edf447e4059e806496eb7983462df728\": rpc error: code = NotFound desc = could not find container \"0f41fd460e2c60d591b7e1631be9e225edf447e4059e806496eb7983462df728\": container with ID starting with 0f41fd460e2c60d591b7e1631be9e225edf447e4059e806496eb7983462df728 not found: ID does not exist"
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.800065 4873 scope.go:117] "RemoveContainer" containerID="67c60116f380dfee9023370eececf0d5081e88cdb702860b7f01a8ec1aef69cc"
Feb 19 10:53:57 crc kubenswrapper[4873]: E0219 10:53:57.800519 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c60116f380dfee9023370eececf0d5081e88cdb702860b7f01a8ec1aef69cc\": container with ID starting with 67c60116f380dfee9023370eececf0d5081e88cdb702860b7f01a8ec1aef69cc not found: ID does not exist" containerID="67c60116f380dfee9023370eececf0d5081e88cdb702860b7f01a8ec1aef69cc"
Feb 19 10:53:57 crc kubenswrapper[4873]: I0219 10:53:57.800568 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c60116f380dfee9023370eececf0d5081e88cdb702860b7f01a8ec1aef69cc"} err="failed to get container status \"67c60116f380dfee9023370eececf0d5081e88cdb702860b7f01a8ec1aef69cc\": rpc error: code = NotFound desc = could not find container \"67c60116f380dfee9023370eececf0d5081e88cdb702860b7f01a8ec1aef69cc\": container with ID starting with 67c60116f380dfee9023370eececf0d5081e88cdb702860b7f01a8ec1aef69cc not found: ID does not exist"
Feb 19 10:53:59 crc kubenswrapper[4873]: I0219 10:53:59.496342 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adb060ed-98a8-4d81-820a-8e2d26500534" path="/var/lib/kubelet/pods/adb060ed-98a8-4d81-820a-8e2d26500534/volumes"
Feb 19 10:55:48 crc kubenswrapper[4873]: I0219 10:55:48.239963 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 10:55:48 crc kubenswrapper[4873]: I0219 10:55:48.240670 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 10:56:18 crc kubenswrapper[4873]: I0219 10:56:18.240339 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 10:56:18 crc kubenswrapper[4873]: I0219 10:56:18.240895 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 10:56:48 crc kubenswrapper[4873]: I0219 10:56:48.240543 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 10:56:48 crc kubenswrapper[4873]: I0219 10:56:48.240994 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 10:56:48 crc kubenswrapper[4873]: I0219 10:56:48.241035 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7"
Feb 19 10:56:48 crc kubenswrapper[4873]: I0219 10:56:48.241837 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 10:56:48 crc kubenswrapper[4873]: I0219 10:56:48.241895 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" gracePeriod=600
Feb 19 10:56:48 crc kubenswrapper[4873]: E0219 10:56:48.424835 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:56:49 crc kubenswrapper[4873]: I0219 10:56:49.380333 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" exitCode=0
Feb 19 10:56:49 crc kubenswrapper[4873]: I0219 10:56:49.380376 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"}
Feb 19 10:56:49 crc kubenswrapper[4873]: I0219 10:56:49.380408 4873 scope.go:117] "RemoveContainer" containerID="3f289291e3d6f52c31a0d326462313d44367b683964d6ef342d209f000362ec9"
Feb 19 10:56:49 crc kubenswrapper[4873]: I0219 10:56:49.381134 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"
Feb 19 10:56:49 crc kubenswrapper[4873]: E0219 10:56:49.381547 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:57:00 crc kubenswrapper[4873]: I0219 10:57:00.484646 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"
Feb 19 10:57:00 crc kubenswrapper[4873]: E0219 10:57:00.485448 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:57:13 crc kubenswrapper[4873]: I0219 10:57:13.484764 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"
Feb 19 10:57:13 crc kubenswrapper[4873]: E0219 10:57:13.486997 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:57:24 crc kubenswrapper[4873]: I0219 10:57:24.484030 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"
Feb 19 10:57:24 crc kubenswrapper[4873]: E0219 10:57:24.484874 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:57:38 crc kubenswrapper[4873]: I0219 10:57:38.484437 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"
Feb 19 10:57:38 crc kubenswrapper[4873]: E0219 10:57:38.485246 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:57:53 crc kubenswrapper[4873]: I0219 10:57:53.484765 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"
Feb 19 10:57:53 crc kubenswrapper[4873]: E0219 10:57:53.485814 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:58:06 crc kubenswrapper[4873]: I0219 10:58:06.484640 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"
Feb 19 10:58:06 crc kubenswrapper[4873]: E0219 10:58:06.485398 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:58:18 crc kubenswrapper[4873]: I0219 10:58:18.485138 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"
Feb 19 10:58:18 crc kubenswrapper[4873]: E0219 10:58:18.486082 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:58:31 crc kubenswrapper[4873]: I0219 10:58:31.493713 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"
Feb 19 10:58:31 crc kubenswrapper[4873]: E0219 10:58:31.494455 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:58:46 crc kubenswrapper[4873]: I0219 10:58:46.485236 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"
Feb 19 10:58:46 crc kubenswrapper[4873]: E0219 10:58:46.486746 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:59:01 crc kubenswrapper[4873]: I0219 10:59:01.495931 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"
Feb 19 10:59:01 crc kubenswrapper[4873]: E0219 10:59:01.496872 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:59:03 crc kubenswrapper[4873]: I0219 10:59:03.011951 4873 scope.go:117] "RemoveContainer" containerID="2e2552b2c4ecf36a5cf4a36c966c229ea32d862bd7b139e3133ee0c86e6b974d"
Feb 19 10:59:03 crc kubenswrapper[4873]: I0219 10:59:03.057504 4873 scope.go:117] "RemoveContainer" containerID="c7b53d3544f9dd7da88afeb6fa02fe407f19ea68d37f09767e6132e62b14b454"
Feb 19 10:59:03 crc kubenswrapper[4873]: I0219 10:59:03.135842 4873 scope.go:117] "RemoveContainer" containerID="27af10342a7e1ad2491360a91681d932dd63d524dbab6ca3add1793c15d831ca"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.751380 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6vtrc"]
Feb 19 10:59:10 crc kubenswrapper[4873]: E0219 10:59:10.752300 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb060ed-98a8-4d81-820a-8e2d26500534" containerName="extract-content"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.752312 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb060ed-98a8-4d81-820a-8e2d26500534" containerName="extract-content"
Feb 19 10:59:10 crc kubenswrapper[4873]: E0219 10:59:10.752324 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb060ed-98a8-4d81-820a-8e2d26500534" containerName="registry-server"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.752330 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb060ed-98a8-4d81-820a-8e2d26500534" containerName="registry-server"
Feb 19 10:59:10 crc kubenswrapper[4873]: E0219 10:59:10.752340 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb060ed-98a8-4d81-820a-8e2d26500534" containerName="extract-utilities"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.752347 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb060ed-98a8-4d81-820a-8e2d26500534" containerName="extract-utilities"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.752524 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="adb060ed-98a8-4d81-820a-8e2d26500534" containerName="registry-server"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.755528 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.770195 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff237fa-c66c-4108-8f98-e737561ed9ff-utilities\") pod \"certified-operators-6vtrc\" (UID: \"dff237fa-c66c-4108-8f98-e737561ed9ff\") " pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.770361 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gkcz\" (UniqueName: \"kubernetes.io/projected/dff237fa-c66c-4108-8f98-e737561ed9ff-kube-api-access-6gkcz\") pod \"certified-operators-6vtrc\" (UID: \"dff237fa-c66c-4108-8f98-e737561ed9ff\") " pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.770507 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff237fa-c66c-4108-8f98-e737561ed9ff-catalog-content\") pod \"certified-operators-6vtrc\" (UID: \"dff237fa-c66c-4108-8f98-e737561ed9ff\") " pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.785335 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6vtrc"]
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.872431 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff237fa-c66c-4108-8f98-e737561ed9ff-utilities\") pod \"certified-operators-6vtrc\" (UID: \"dff237fa-c66c-4108-8f98-e737561ed9ff\") " pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.872536 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gkcz\" (UniqueName: \"kubernetes.io/projected/dff237fa-c66c-4108-8f98-e737561ed9ff-kube-api-access-6gkcz\") pod \"certified-operators-6vtrc\" (UID: \"dff237fa-c66c-4108-8f98-e737561ed9ff\") " pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.872592 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff237fa-c66c-4108-8f98-e737561ed9ff-catalog-content\") pod \"certified-operators-6vtrc\" (UID: \"dff237fa-c66c-4108-8f98-e737561ed9ff\") " pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.873062 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff237fa-c66c-4108-8f98-e737561ed9ff-utilities\") pod \"certified-operators-6vtrc\" (UID: \"dff237fa-c66c-4108-8f98-e737561ed9ff\") " pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.873237 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff237fa-c66c-4108-8f98-e737561ed9ff-catalog-content\") pod \"certified-operators-6vtrc\" (UID: \"dff237fa-c66c-4108-8f98-e737561ed9ff\") " pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:10 crc kubenswrapper[4873]: I0219 10:59:10.894089 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gkcz\" (UniqueName: \"kubernetes.io/projected/dff237fa-c66c-4108-8f98-e737561ed9ff-kube-api-access-6gkcz\") pod \"certified-operators-6vtrc\" (UID: \"dff237fa-c66c-4108-8f98-e737561ed9ff\") " pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:11 crc kubenswrapper[4873]: I0219 10:59:11.077645 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:11 crc kubenswrapper[4873]: I0219 10:59:11.623788 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6vtrc"]
Feb 19 10:59:11 crc kubenswrapper[4873]: W0219 10:59:11.628248 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddff237fa_c66c_4108_8f98_e737561ed9ff.slice/crio-eed49f58dc087c3b47898a9bc1ffb50a29782323277a5b9db0fc3ffae7be28e4 WatchSource:0}: Error finding container eed49f58dc087c3b47898a9bc1ffb50a29782323277a5b9db0fc3ffae7be28e4: Status 404 returned error can't find the container with id eed49f58dc087c3b47898a9bc1ffb50a29782323277a5b9db0fc3ffae7be28e4
Feb 19 10:59:11 crc kubenswrapper[4873]: I0219 10:59:11.881333 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vtrc" event={"ID":"dff237fa-c66c-4108-8f98-e737561ed9ff","Type":"ContainerStarted","Data":"eed49f58dc087c3b47898a9bc1ffb50a29782323277a5b9db0fc3ffae7be28e4"}
Feb 19 10:59:12 crc kubenswrapper[4873]: I0219 10:59:12.900367 4873 generic.go:334] "Generic (PLEG): container finished" podID="dff237fa-c66c-4108-8f98-e737561ed9ff" containerID="7a3f863c5402ebc57d8a4349326c3916d6c7005e837fa8f8e791d46ad913d630" exitCode=0
Feb 19 10:59:12 crc kubenswrapper[4873]: I0219 10:59:12.900922 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vtrc" event={"ID":"dff237fa-c66c-4108-8f98-e737561ed9ff","Type":"ContainerDied","Data":"7a3f863c5402ebc57d8a4349326c3916d6c7005e837fa8f8e791d46ad913d630"}
Feb 19 10:59:12 crc kubenswrapper[4873]: I0219 10:59:12.904575 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 10:59:13 crc kubenswrapper[4873]: I0219 10:59:13.910776 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vtrc" event={"ID":"dff237fa-c66c-4108-8f98-e737561ed9ff","Type":"ContainerStarted","Data":"984fe396a2f38d658124a2949cbf8d81c266e54ed3cdf8e85e387c0940b3d338"}
Feb 19 10:59:15 crc kubenswrapper[4873]: I0219 10:59:15.484808 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de"
Feb 19 10:59:15 crc kubenswrapper[4873]: E0219 10:59:15.485356 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 10:59:15 crc kubenswrapper[4873]: I0219 10:59:15.930548 4873 generic.go:334] "Generic (PLEG): container finished" podID="dff237fa-c66c-4108-8f98-e737561ed9ff" containerID="984fe396a2f38d658124a2949cbf8d81c266e54ed3cdf8e85e387c0940b3d338" exitCode=0
Feb 19 10:59:15 crc kubenswrapper[4873]: I0219 10:59:15.930595 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vtrc" event={"ID":"dff237fa-c66c-4108-8f98-e737561ed9ff","Type":"ContainerDied","Data":"984fe396a2f38d658124a2949cbf8d81c266e54ed3cdf8e85e387c0940b3d338"}
Feb 19 10:59:16 crc kubenswrapper[4873]: I0219 10:59:16.945720 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vtrc" event={"ID":"dff237fa-c66c-4108-8f98-e737561ed9ff","Type":"ContainerStarted","Data":"24f7cc13e93140d4caec71be495e460686bd5d48b280fafe5a97dea3ac447fa5"}
Feb 19 10:59:16 crc kubenswrapper[4873]: I0219 10:59:16.997282 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6vtrc" podStartSLOduration=3.335551627 podStartE2EDuration="6.997261621s" podCreationTimestamp="2026-02-19 10:59:10 +0000 UTC" firstStartedPulling="2026-02-19 10:59:12.904242573 +0000 UTC m=+4462.193674211" lastFinishedPulling="2026-02-19 10:59:16.565952567 +0000 UTC m=+4465.855384205" observedRunningTime="2026-02-19 10:59:16.97370637 +0000 UTC m=+4466.263138008" watchObservedRunningTime="2026-02-19 10:59:16.997261621 +0000 UTC m=+4466.286693249"
Feb 19 10:59:21 crc kubenswrapper[4873]: I0219 10:59:21.077960 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:21 crc kubenswrapper[4873]: I0219 10:59:21.078609 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:21 crc kubenswrapper[4873]: I0219 10:59:21.163027 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:22 crc kubenswrapper[4873]: I0219 10:59:22.056213 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:22 crc kubenswrapper[4873]: I0219 10:59:22.104789 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6vtrc"]
Feb 19 10:59:24 crc kubenswrapper[4873]: I0219 10:59:24.024790 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6vtrc" podUID="dff237fa-c66c-4108-8f98-e737561ed9ff" containerName="registry-server" containerID="cri-o://24f7cc13e93140d4caec71be495e460686bd5d48b280fafe5a97dea3ac447fa5" gracePeriod=2
Feb 19 10:59:24 crc kubenswrapper[4873]: I0219 10:59:24.540636 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:24 crc kubenswrapper[4873]: I0219 10:59:24.567774 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff237fa-c66c-4108-8f98-e737561ed9ff-catalog-content\") pod \"dff237fa-c66c-4108-8f98-e737561ed9ff\" (UID: \"dff237fa-c66c-4108-8f98-e737561ed9ff\") "
Feb 19 10:59:24 crc kubenswrapper[4873]: I0219 10:59:24.636620 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff237fa-c66c-4108-8f98-e737561ed9ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dff237fa-c66c-4108-8f98-e737561ed9ff" (UID: "dff237fa-c66c-4108-8f98-e737561ed9ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:59:24 crc kubenswrapper[4873]: I0219 10:59:24.669443 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff237fa-c66c-4108-8f98-e737561ed9ff-utilities\") pod \"dff237fa-c66c-4108-8f98-e737561ed9ff\" (UID: \"dff237fa-c66c-4108-8f98-e737561ed9ff\") "
Feb 19 10:59:24 crc kubenswrapper[4873]: I0219 10:59:24.671064 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff237fa-c66c-4108-8f98-e737561ed9ff-utilities" (OuterVolumeSpecName: "utilities") pod "dff237fa-c66c-4108-8f98-e737561ed9ff" (UID: "dff237fa-c66c-4108-8f98-e737561ed9ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 10:59:24 crc kubenswrapper[4873]: I0219 10:59:24.672000 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gkcz\" (UniqueName: \"kubernetes.io/projected/dff237fa-c66c-4108-8f98-e737561ed9ff-kube-api-access-6gkcz\") pod \"dff237fa-c66c-4108-8f98-e737561ed9ff\" (UID: \"dff237fa-c66c-4108-8f98-e737561ed9ff\") "
Feb 19 10:59:24 crc kubenswrapper[4873]: I0219 10:59:24.672917 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff237fa-c66c-4108-8f98-e737561ed9ff-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 10:59:24 crc kubenswrapper[4873]: I0219 10:59:24.673141 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff237fa-c66c-4108-8f98-e737561ed9ff-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 10:59:24 crc kubenswrapper[4873]: I0219 10:59:24.678342 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff237fa-c66c-4108-8f98-e737561ed9ff-kube-api-access-6gkcz" (OuterVolumeSpecName: "kube-api-access-6gkcz") pod "dff237fa-c66c-4108-8f98-e737561ed9ff" (UID: "dff237fa-c66c-4108-8f98-e737561ed9ff"). InnerVolumeSpecName "kube-api-access-6gkcz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 10:59:24 crc kubenswrapper[4873]: I0219 10:59:24.775051 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gkcz\" (UniqueName: \"kubernetes.io/projected/dff237fa-c66c-4108-8f98-e737561ed9ff-kube-api-access-6gkcz\") on node \"crc\" DevicePath \"\""
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.036786 4873 generic.go:334] "Generic (PLEG): container finished" podID="dff237fa-c66c-4108-8f98-e737561ed9ff" containerID="24f7cc13e93140d4caec71be495e460686bd5d48b280fafe5a97dea3ac447fa5" exitCode=0
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.036844 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vtrc" event={"ID":"dff237fa-c66c-4108-8f98-e737561ed9ff","Type":"ContainerDied","Data":"24f7cc13e93140d4caec71be495e460686bd5d48b280fafe5a97dea3ac447fa5"}
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.038376 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vtrc" event={"ID":"dff237fa-c66c-4108-8f98-e737561ed9ff","Type":"ContainerDied","Data":"eed49f58dc087c3b47898a9bc1ffb50a29782323277a5b9db0fc3ffae7be28e4"}
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.038423 4873 scope.go:117] "RemoveContainer" containerID="24f7cc13e93140d4caec71be495e460686bd5d48b280fafe5a97dea3ac447fa5"
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.036873 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vtrc"
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.065804 4873 scope.go:117] "RemoveContainer" containerID="984fe396a2f38d658124a2949cbf8d81c266e54ed3cdf8e85e387c0940b3d338"
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.090615 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6vtrc"]
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.105874 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6vtrc"]
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.115818 4873 scope.go:117] "RemoveContainer" containerID="7a3f863c5402ebc57d8a4349326c3916d6c7005e837fa8f8e791d46ad913d630"
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.158557 4873 scope.go:117] "RemoveContainer" containerID="24f7cc13e93140d4caec71be495e460686bd5d48b280fafe5a97dea3ac447fa5"
Feb 19 10:59:25 crc kubenswrapper[4873]: E0219 10:59:25.159035 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24f7cc13e93140d4caec71be495e460686bd5d48b280fafe5a97dea3ac447fa5\": container with ID starting with 24f7cc13e93140d4caec71be495e460686bd5d48b280fafe5a97dea3ac447fa5 not found: ID does not exist" containerID="24f7cc13e93140d4caec71be495e460686bd5d48b280fafe5a97dea3ac447fa5"
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.159075 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24f7cc13e93140d4caec71be495e460686bd5d48b280fafe5a97dea3ac447fa5"} err="failed to get container status \"24f7cc13e93140d4caec71be495e460686bd5d48b280fafe5a97dea3ac447fa5\": rpc error: code = NotFound desc = could not find container \"24f7cc13e93140d4caec71be495e460686bd5d48b280fafe5a97dea3ac447fa5\": container with ID starting with 24f7cc13e93140d4caec71be495e460686bd5d48b280fafe5a97dea3ac447fa5 not found: ID does not exist"
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.159121 4873 scope.go:117] "RemoveContainer" containerID="984fe396a2f38d658124a2949cbf8d81c266e54ed3cdf8e85e387c0940b3d338"
Feb 19 10:59:25 crc kubenswrapper[4873]: E0219 10:59:25.159525 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"984fe396a2f38d658124a2949cbf8d81c266e54ed3cdf8e85e387c0940b3d338\": container with ID starting with 984fe396a2f38d658124a2949cbf8d81c266e54ed3cdf8e85e387c0940b3d338 not found: ID does not exist" containerID="984fe396a2f38d658124a2949cbf8d81c266e54ed3cdf8e85e387c0940b3d338"
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.159546 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"984fe396a2f38d658124a2949cbf8d81c266e54ed3cdf8e85e387c0940b3d338"} err="failed to get container status \"984fe396a2f38d658124a2949cbf8d81c266e54ed3cdf8e85e387c0940b3d338\": rpc error: code = NotFound desc = could not find container \"984fe396a2f38d658124a2949cbf8d81c266e54ed3cdf8e85e387c0940b3d338\": container with ID starting with 984fe396a2f38d658124a2949cbf8d81c266e54ed3cdf8e85e387c0940b3d338 not found: ID does not exist"
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.159558 4873 scope.go:117] "RemoveContainer" containerID="7a3f863c5402ebc57d8a4349326c3916d6c7005e837fa8f8e791d46ad913d630"
Feb 19 10:59:25 crc kubenswrapper[4873]: E0219 10:59:25.159927 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a3f863c5402ebc57d8a4349326c3916d6c7005e837fa8f8e791d46ad913d630\": container with ID starting with 7a3f863c5402ebc57d8a4349326c3916d6c7005e837fa8f8e791d46ad913d630 not found: ID does not exist" containerID="7a3f863c5402ebc57d8a4349326c3916d6c7005e837fa8f8e791d46ad913d630"
Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.159952 4873
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a3f863c5402ebc57d8a4349326c3916d6c7005e837fa8f8e791d46ad913d630"} err="failed to get container status \"7a3f863c5402ebc57d8a4349326c3916d6c7005e837fa8f8e791d46ad913d630\": rpc error: code = NotFound desc = could not find container \"7a3f863c5402ebc57d8a4349326c3916d6c7005e837fa8f8e791d46ad913d630\": container with ID starting with 7a3f863c5402ebc57d8a4349326c3916d6c7005e837fa8f8e791d46ad913d630 not found: ID does not exist" Feb 19 10:59:25 crc kubenswrapper[4873]: I0219 10:59:25.496851 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff237fa-c66c-4108-8f98-e737561ed9ff" path="/var/lib/kubelet/pods/dff237fa-c66c-4108-8f98-e737561ed9ff/volumes" Feb 19 10:59:28 crc kubenswrapper[4873]: I0219 10:59:28.485491 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" Feb 19 10:59:28 crc kubenswrapper[4873]: E0219 10:59:28.488419 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:59:42 crc kubenswrapper[4873]: I0219 10:59:42.484563 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" Feb 19 10:59:42 crc kubenswrapper[4873]: E0219 10:59:42.485290 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 10:59:53 crc kubenswrapper[4873]: I0219 10:59:53.483920 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" Feb 19 10:59:53 crc kubenswrapper[4873]: E0219 10:59:53.484720 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.183787 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv"] Feb 19 11:00:00 crc kubenswrapper[4873]: E0219 11:00:00.184807 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff237fa-c66c-4108-8f98-e737561ed9ff" containerName="registry-server" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.184821 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff237fa-c66c-4108-8f98-e737561ed9ff" containerName="registry-server" Feb 19 11:00:00 crc kubenswrapper[4873]: E0219 11:00:00.184850 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff237fa-c66c-4108-8f98-e737561ed9ff" containerName="extract-utilities" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.184857 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff237fa-c66c-4108-8f98-e737561ed9ff" containerName="extract-utilities" Feb 19 11:00:00 crc kubenswrapper[4873]: E0219 11:00:00.184869 4873 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="dff237fa-c66c-4108-8f98-e737561ed9ff" containerName="extract-content" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.184875 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff237fa-c66c-4108-8f98-e737561ed9ff" containerName="extract-content" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.185069 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff237fa-c66c-4108-8f98-e737561ed9ff" containerName="registry-server" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.185775 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.188297 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.188483 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.194271 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv"] Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.274485 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3c4ecc6-1490-4170-9dd3-122c4417e62b-config-volume\") pod \"collect-profiles-29524980-zb8sv\" (UID: \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.274965 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmqv7\" (UniqueName: 
\"kubernetes.io/projected/e3c4ecc6-1490-4170-9dd3-122c4417e62b-kube-api-access-wmqv7\") pod \"collect-profiles-29524980-zb8sv\" (UID: \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.275134 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3c4ecc6-1490-4170-9dd3-122c4417e62b-secret-volume\") pod \"collect-profiles-29524980-zb8sv\" (UID: \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.376931 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmqv7\" (UniqueName: \"kubernetes.io/projected/e3c4ecc6-1490-4170-9dd3-122c4417e62b-kube-api-access-wmqv7\") pod \"collect-profiles-29524980-zb8sv\" (UID: \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.377037 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3c4ecc6-1490-4170-9dd3-122c4417e62b-secret-volume\") pod \"collect-profiles-29524980-zb8sv\" (UID: \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.377092 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3c4ecc6-1490-4170-9dd3-122c4417e62b-config-volume\") pod \"collect-profiles-29524980-zb8sv\" (UID: \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" Feb 19 11:00:00 crc 
kubenswrapper[4873]: I0219 11:00:00.378027 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3c4ecc6-1490-4170-9dd3-122c4417e62b-config-volume\") pod \"collect-profiles-29524980-zb8sv\" (UID: \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.384056 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3c4ecc6-1490-4170-9dd3-122c4417e62b-secret-volume\") pod \"collect-profiles-29524980-zb8sv\" (UID: \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.392914 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmqv7\" (UniqueName: \"kubernetes.io/projected/e3c4ecc6-1490-4170-9dd3-122c4417e62b-kube-api-access-wmqv7\") pod \"collect-profiles-29524980-zb8sv\" (UID: \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.507985 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" Feb 19 11:00:00 crc kubenswrapper[4873]: I0219 11:00:00.964077 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv"] Feb 19 11:00:01 crc kubenswrapper[4873]: I0219 11:00:01.398532 4873 generic.go:334] "Generic (PLEG): container finished" podID="e3c4ecc6-1490-4170-9dd3-122c4417e62b" containerID="8127d29f0285689637090a99935e6d60c3fd803febae61bd0dc63c50815f28db" exitCode=0 Feb 19 11:00:01 crc kubenswrapper[4873]: I0219 11:00:01.398589 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" event={"ID":"e3c4ecc6-1490-4170-9dd3-122c4417e62b","Type":"ContainerDied","Data":"8127d29f0285689637090a99935e6d60c3fd803febae61bd0dc63c50815f28db"} Feb 19 11:00:01 crc kubenswrapper[4873]: I0219 11:00:01.398808 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" event={"ID":"e3c4ecc6-1490-4170-9dd3-122c4417e62b","Type":"ContainerStarted","Data":"f2c237e4fd2699824c9cd48a626f719ce97725405815fb96cb5e98e2747d0a3d"} Feb 19 11:00:02 crc kubenswrapper[4873]: I0219 11:00:02.790272 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" Feb 19 11:00:02 crc kubenswrapper[4873]: I0219 11:00:02.929931 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3c4ecc6-1490-4170-9dd3-122c4417e62b-secret-volume\") pod \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\" (UID: \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\") " Feb 19 11:00:02 crc kubenswrapper[4873]: I0219 11:00:02.930114 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmqv7\" (UniqueName: \"kubernetes.io/projected/e3c4ecc6-1490-4170-9dd3-122c4417e62b-kube-api-access-wmqv7\") pod \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\" (UID: \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\") " Feb 19 11:00:02 crc kubenswrapper[4873]: I0219 11:00:02.930188 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3c4ecc6-1490-4170-9dd3-122c4417e62b-config-volume\") pod \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\" (UID: \"e3c4ecc6-1490-4170-9dd3-122c4417e62b\") " Feb 19 11:00:02 crc kubenswrapper[4873]: I0219 11:00:02.930784 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3c4ecc6-1490-4170-9dd3-122c4417e62b-config-volume" (OuterVolumeSpecName: "config-volume") pod "e3c4ecc6-1490-4170-9dd3-122c4417e62b" (UID: "e3c4ecc6-1490-4170-9dd3-122c4417e62b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 11:00:02 crc kubenswrapper[4873]: I0219 11:00:02.936642 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c4ecc6-1490-4170-9dd3-122c4417e62b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e3c4ecc6-1490-4170-9dd3-122c4417e62b" (UID: "e3c4ecc6-1490-4170-9dd3-122c4417e62b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 11:00:02 crc kubenswrapper[4873]: I0219 11:00:02.937684 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c4ecc6-1490-4170-9dd3-122c4417e62b-kube-api-access-wmqv7" (OuterVolumeSpecName: "kube-api-access-wmqv7") pod "e3c4ecc6-1490-4170-9dd3-122c4417e62b" (UID: "e3c4ecc6-1490-4170-9dd3-122c4417e62b"). InnerVolumeSpecName "kube-api-access-wmqv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:00:03 crc kubenswrapper[4873]: I0219 11:00:03.032769 4873 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3c4ecc6-1490-4170-9dd3-122c4417e62b-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 11:00:03 crc kubenswrapper[4873]: I0219 11:00:03.032807 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmqv7\" (UniqueName: \"kubernetes.io/projected/e3c4ecc6-1490-4170-9dd3-122c4417e62b-kube-api-access-wmqv7\") on node \"crc\" DevicePath \"\"" Feb 19 11:00:03 crc kubenswrapper[4873]: I0219 11:00:03.032818 4873 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3c4ecc6-1490-4170-9dd3-122c4417e62b-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 11:00:03 crc kubenswrapper[4873]: I0219 11:00:03.437291 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" event={"ID":"e3c4ecc6-1490-4170-9dd3-122c4417e62b","Type":"ContainerDied","Data":"f2c237e4fd2699824c9cd48a626f719ce97725405815fb96cb5e98e2747d0a3d"} Feb 19 11:00:03 crc kubenswrapper[4873]: I0219 11:00:03.437338 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2c237e4fd2699824c9cd48a626f719ce97725405815fb96cb5e98e2747d0a3d" Feb 19 11:00:03 crc kubenswrapper[4873]: I0219 11:00:03.437692 4873 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524980-zb8sv" Feb 19 11:00:03 crc kubenswrapper[4873]: I0219 11:00:03.889032 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"] Feb 19 11:00:03 crc kubenswrapper[4873]: I0219 11:00:03.901882 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524935-sjf58"] Feb 19 11:00:04 crc kubenswrapper[4873]: I0219 11:00:04.484063 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" Feb 19 11:00:04 crc kubenswrapper[4873]: E0219 11:00:04.484401 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:00:05 crc kubenswrapper[4873]: I0219 11:00:05.500005 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb4ec2bd-4c16-4682-873a-4fbdcc5d9580" path="/var/lib/kubelet/pods/fb4ec2bd-4c16-4682-873a-4fbdcc5d9580/volumes" Feb 19 11:00:09 crc kubenswrapper[4873]: E0219 11:00:09.122716 4873 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.156:35932->38.102.83.156:45689: write tcp 38.102.83.156:35932->38.102.83.156:45689: write: connection reset by peer Feb 19 11:00:19 crc kubenswrapper[4873]: I0219 11:00:19.484243 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" Feb 19 11:00:19 crc kubenswrapper[4873]: E0219 11:00:19.485012 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:00:33 crc kubenswrapper[4873]: I0219 11:00:33.484645 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" Feb 19 11:00:33 crc kubenswrapper[4873]: E0219 11:00:33.485596 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:00:47 crc kubenswrapper[4873]: I0219 11:00:47.484277 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" Feb 19 11:00:47 crc kubenswrapper[4873]: E0219 11:00:47.485055 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:00:59 crc kubenswrapper[4873]: I0219 11:00:59.485195 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" Feb 19 11:00:59 crc kubenswrapper[4873]: E0219 11:00:59.486227 4873 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.156856 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29524981-pxsmx"] Feb 19 11:01:00 crc kubenswrapper[4873]: E0219 11:01:00.157787 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c4ecc6-1490-4170-9dd3-122c4417e62b" containerName="collect-profiles" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.157884 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c4ecc6-1490-4170-9dd3-122c4417e62b" containerName="collect-profiles" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.158183 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c4ecc6-1490-4170-9dd3-122c4417e62b" containerName="collect-profiles" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.158950 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.176977 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29524981-pxsmx"] Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.260797 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-combined-ca-bundle\") pod \"keystone-cron-29524981-pxsmx\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.260893 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-config-data\") pod \"keystone-cron-29524981-pxsmx\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.261039 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42s7q\" (UniqueName: \"kubernetes.io/projected/3f08f0c4-870d-4d9a-8a82-ce22827ce779-kube-api-access-42s7q\") pod \"keystone-cron-29524981-pxsmx\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.261141 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-fernet-keys\") pod \"keystone-cron-29524981-pxsmx\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.362786 4873 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-42s7q\" (UniqueName: \"kubernetes.io/projected/3f08f0c4-870d-4d9a-8a82-ce22827ce779-kube-api-access-42s7q\") pod \"keystone-cron-29524981-pxsmx\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.362928 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-fernet-keys\") pod \"keystone-cron-29524981-pxsmx\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.363052 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-combined-ca-bundle\") pod \"keystone-cron-29524981-pxsmx\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.363142 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-config-data\") pod \"keystone-cron-29524981-pxsmx\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.384977 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-fernet-keys\") pod \"keystone-cron-29524981-pxsmx\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.391876 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42s7q\" (UniqueName: 
\"kubernetes.io/projected/3f08f0c4-870d-4d9a-8a82-ce22827ce779-kube-api-access-42s7q\") pod \"keystone-cron-29524981-pxsmx\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.392338 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-config-data\") pod \"keystone-cron-29524981-pxsmx\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.392544 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-combined-ca-bundle\") pod \"keystone-cron-29524981-pxsmx\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.484165 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:00 crc kubenswrapper[4873]: I0219 11:01:00.977332 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29524981-pxsmx"] Feb 19 11:01:01 crc kubenswrapper[4873]: I0219 11:01:01.986795 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29524981-pxsmx" event={"ID":"3f08f0c4-870d-4d9a-8a82-ce22827ce779","Type":"ContainerStarted","Data":"8690a5794fdd471bca42526814dea6677eb6e29b84003855bd13530e981b2110"} Feb 19 11:01:01 crc kubenswrapper[4873]: I0219 11:01:01.987211 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29524981-pxsmx" event={"ID":"3f08f0c4-870d-4d9a-8a82-ce22827ce779","Type":"ContainerStarted","Data":"4033576ca7f24748022d1be039a2a747e580f1f4e4d8f70a15ebec2763443c71"} Feb 19 11:01:02 crc kubenswrapper[4873]: I0219 11:01:02.004629 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29524981-pxsmx" podStartSLOduration=2.004603053 podStartE2EDuration="2.004603053s" podCreationTimestamp="2026-02-19 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 11:01:02.00368723 +0000 UTC m=+4571.293118868" watchObservedRunningTime="2026-02-19 11:01:02.004603053 +0000 UTC m=+4571.294034691" Feb 19 11:01:03 crc kubenswrapper[4873]: I0219 11:01:03.218543 4873 scope.go:117] "RemoveContainer" containerID="51b872a4026735697f0f9cc00b395427fbc06efd93a529d94f5319e0d220778e" Feb 19 11:01:06 crc kubenswrapper[4873]: I0219 11:01:06.031938 4873 generic.go:334] "Generic (PLEG): container finished" podID="3f08f0c4-870d-4d9a-8a82-ce22827ce779" containerID="8690a5794fdd471bca42526814dea6677eb6e29b84003855bd13530e981b2110" exitCode=0 Feb 19 11:01:06 crc kubenswrapper[4873]: I0219 11:01:06.032236 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-cron-29524981-pxsmx" event={"ID":"3f08f0c4-870d-4d9a-8a82-ce22827ce779","Type":"ContainerDied","Data":"8690a5794fdd471bca42526814dea6677eb6e29b84003855bd13530e981b2110"} Feb 19 11:01:07 crc kubenswrapper[4873]: I0219 11:01:07.457773 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:07 crc kubenswrapper[4873]: I0219 11:01:07.516396 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-combined-ca-bundle\") pod \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " Feb 19 11:01:07 crc kubenswrapper[4873]: I0219 11:01:07.516669 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-fernet-keys\") pod \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " Feb 19 11:01:07 crc kubenswrapper[4873]: I0219 11:01:07.516692 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-config-data\") pod \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " Feb 19 11:01:07 crc kubenswrapper[4873]: I0219 11:01:07.516799 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42s7q\" (UniqueName: \"kubernetes.io/projected/3f08f0c4-870d-4d9a-8a82-ce22827ce779-kube-api-access-42s7q\") pod \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\" (UID: \"3f08f0c4-870d-4d9a-8a82-ce22827ce779\") " Feb 19 11:01:07 crc kubenswrapper[4873]: I0219 11:01:07.522544 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3f08f0c4-870d-4d9a-8a82-ce22827ce779" (UID: "3f08f0c4-870d-4d9a-8a82-ce22827ce779"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 11:01:07 crc kubenswrapper[4873]: I0219 11:01:07.522551 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f08f0c4-870d-4d9a-8a82-ce22827ce779-kube-api-access-42s7q" (OuterVolumeSpecName: "kube-api-access-42s7q") pod "3f08f0c4-870d-4d9a-8a82-ce22827ce779" (UID: "3f08f0c4-870d-4d9a-8a82-ce22827ce779"). InnerVolumeSpecName "kube-api-access-42s7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:01:07 crc kubenswrapper[4873]: I0219 11:01:07.546825 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f08f0c4-870d-4d9a-8a82-ce22827ce779" (UID: "3f08f0c4-870d-4d9a-8a82-ce22827ce779"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 11:01:07 crc kubenswrapper[4873]: I0219 11:01:07.575247 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-config-data" (OuterVolumeSpecName: "config-data") pod "3f08f0c4-870d-4d9a-8a82-ce22827ce779" (UID: "3f08f0c4-870d-4d9a-8a82-ce22827ce779"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 11:01:07 crc kubenswrapper[4873]: I0219 11:01:07.619450 4873 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 11:01:07 crc kubenswrapper[4873]: I0219 11:01:07.619479 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 11:01:07 crc kubenswrapper[4873]: I0219 11:01:07.619492 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42s7q\" (UniqueName: \"kubernetes.io/projected/3f08f0c4-870d-4d9a-8a82-ce22827ce779-kube-api-access-42s7q\") on node \"crc\" DevicePath \"\"" Feb 19 11:01:07 crc kubenswrapper[4873]: I0219 11:01:07.619502 4873 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f08f0c4-870d-4d9a-8a82-ce22827ce779-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 11:01:08 crc kubenswrapper[4873]: I0219 11:01:08.053909 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29524981-pxsmx" event={"ID":"3f08f0c4-870d-4d9a-8a82-ce22827ce779","Type":"ContainerDied","Data":"4033576ca7f24748022d1be039a2a747e580f1f4e4d8f70a15ebec2763443c71"} Feb 19 11:01:08 crc kubenswrapper[4873]: I0219 11:01:08.054587 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4033576ca7f24748022d1be039a2a747e580f1f4e4d8f70a15ebec2763443c71" Feb 19 11:01:08 crc kubenswrapper[4873]: I0219 11:01:08.053973 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29524981-pxsmx" Feb 19 11:01:13 crc kubenswrapper[4873]: I0219 11:01:13.484130 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" Feb 19 11:01:13 crc kubenswrapper[4873]: E0219 11:01:13.484737 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:01:27 crc kubenswrapper[4873]: I0219 11:01:27.484725 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" Feb 19 11:01:27 crc kubenswrapper[4873]: E0219 11:01:27.485907 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:01:38 crc kubenswrapper[4873]: I0219 11:01:38.485166 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" Feb 19 11:01:38 crc kubenswrapper[4873]: E0219 11:01:38.485911 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:01:53 crc kubenswrapper[4873]: I0219 11:01:53.485237 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" Feb 19 11:01:54 crc kubenswrapper[4873]: I0219 11:01:54.486086 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"cb43480cbf19d4507b02920f7c71a6827821b09f9e9f251bd0c0f1803ed97739"} Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.281296 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-plncc"] Feb 19 11:03:09 crc kubenswrapper[4873]: E0219 11:03:09.282146 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f08f0c4-870d-4d9a-8a82-ce22827ce779" containerName="keystone-cron" Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.282158 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f08f0c4-870d-4d9a-8a82-ce22827ce779" containerName="keystone-cron" Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.282340 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f08f0c4-870d-4d9a-8a82-ce22827ce779" containerName="keystone-cron" Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.283747 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.290997 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-plncc"] Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.395632 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e290883-f526-41db-a353-55a50f744490-utilities\") pod \"redhat-operators-plncc\" (UID: \"6e290883-f526-41db-a353-55a50f744490\") " pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.395683 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e290883-f526-41db-a353-55a50f744490-catalog-content\") pod \"redhat-operators-plncc\" (UID: \"6e290883-f526-41db-a353-55a50f744490\") " pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.395713 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl4td\" (UniqueName: \"kubernetes.io/projected/6e290883-f526-41db-a353-55a50f744490-kube-api-access-gl4td\") pod \"redhat-operators-plncc\" (UID: \"6e290883-f526-41db-a353-55a50f744490\") " pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.498298 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e290883-f526-41db-a353-55a50f744490-utilities\") pod \"redhat-operators-plncc\" (UID: \"6e290883-f526-41db-a353-55a50f744490\") " pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.498340 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e290883-f526-41db-a353-55a50f744490-catalog-content\") pod \"redhat-operators-plncc\" (UID: \"6e290883-f526-41db-a353-55a50f744490\") " pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.498372 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl4td\" (UniqueName: \"kubernetes.io/projected/6e290883-f526-41db-a353-55a50f744490-kube-api-access-gl4td\") pod \"redhat-operators-plncc\" (UID: \"6e290883-f526-41db-a353-55a50f744490\") " pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.498855 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e290883-f526-41db-a353-55a50f744490-catalog-content\") pod \"redhat-operators-plncc\" (UID: \"6e290883-f526-41db-a353-55a50f744490\") " pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.499023 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e290883-f526-41db-a353-55a50f744490-utilities\") pod \"redhat-operators-plncc\" (UID: \"6e290883-f526-41db-a353-55a50f744490\") " pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.529311 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl4td\" (UniqueName: \"kubernetes.io/projected/6e290883-f526-41db-a353-55a50f744490-kube-api-access-gl4td\") pod \"redhat-operators-plncc\" (UID: \"6e290883-f526-41db-a353-55a50f744490\") " pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:09 crc kubenswrapper[4873]: I0219 11:03:09.623664 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:10 crc kubenswrapper[4873]: I0219 11:03:10.203501 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-plncc"] Feb 19 11:03:11 crc kubenswrapper[4873]: I0219 11:03:11.213254 4873 generic.go:334] "Generic (PLEG): container finished" podID="6e290883-f526-41db-a353-55a50f744490" containerID="6de9ee853a1df07b7a365ba1104b2be88eabbf7c1da72e76e0f20f1e8736adbe" exitCode=0 Feb 19 11:03:11 crc kubenswrapper[4873]: I0219 11:03:11.213315 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plncc" event={"ID":"6e290883-f526-41db-a353-55a50f744490","Type":"ContainerDied","Data":"6de9ee853a1df07b7a365ba1104b2be88eabbf7c1da72e76e0f20f1e8736adbe"} Feb 19 11:03:11 crc kubenswrapper[4873]: I0219 11:03:11.213581 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plncc" event={"ID":"6e290883-f526-41db-a353-55a50f744490","Type":"ContainerStarted","Data":"5e8aa772e7b0649a59737f94f1f93bbab70267c1d35f86ec43c903ae04595e28"} Feb 19 11:03:13 crc kubenswrapper[4873]: I0219 11:03:13.232256 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plncc" event={"ID":"6e290883-f526-41db-a353-55a50f744490","Type":"ContainerStarted","Data":"72c590d804613f1c773acf673eb71ec0069c6e7e290132e13d2a88008e3611cd"} Feb 19 11:03:17 crc kubenswrapper[4873]: I0219 11:03:17.236928 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-svn5x"] Feb 19 11:03:17 crc kubenswrapper[4873]: I0219 11:03:17.239628 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:17 crc kubenswrapper[4873]: I0219 11:03:17.258087 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-svn5x"] Feb 19 11:03:17 crc kubenswrapper[4873]: I0219 11:03:17.290880 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e236a7a3-ab66-4f76-aba7-ffed81663143-utilities\") pod \"redhat-marketplace-svn5x\" (UID: \"e236a7a3-ab66-4f76-aba7-ffed81663143\") " pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:17 crc kubenswrapper[4873]: I0219 11:03:17.291311 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e236a7a3-ab66-4f76-aba7-ffed81663143-catalog-content\") pod \"redhat-marketplace-svn5x\" (UID: \"e236a7a3-ab66-4f76-aba7-ffed81663143\") " pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:17 crc kubenswrapper[4873]: I0219 11:03:17.291501 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrlpd\" (UniqueName: \"kubernetes.io/projected/e236a7a3-ab66-4f76-aba7-ffed81663143-kube-api-access-xrlpd\") pod \"redhat-marketplace-svn5x\" (UID: \"e236a7a3-ab66-4f76-aba7-ffed81663143\") " pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:17 crc kubenswrapper[4873]: I0219 11:03:17.393150 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e236a7a3-ab66-4f76-aba7-ffed81663143-catalog-content\") pod \"redhat-marketplace-svn5x\" (UID: \"e236a7a3-ab66-4f76-aba7-ffed81663143\") " pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:17 crc kubenswrapper[4873]: I0219 11:03:17.393241 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xrlpd\" (UniqueName: \"kubernetes.io/projected/e236a7a3-ab66-4f76-aba7-ffed81663143-kube-api-access-xrlpd\") pod \"redhat-marketplace-svn5x\" (UID: \"e236a7a3-ab66-4f76-aba7-ffed81663143\") " pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:17 crc kubenswrapper[4873]: I0219 11:03:17.393353 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e236a7a3-ab66-4f76-aba7-ffed81663143-utilities\") pod \"redhat-marketplace-svn5x\" (UID: \"e236a7a3-ab66-4f76-aba7-ffed81663143\") " pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:17 crc kubenswrapper[4873]: I0219 11:03:17.393818 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e236a7a3-ab66-4f76-aba7-ffed81663143-catalog-content\") pod \"redhat-marketplace-svn5x\" (UID: \"e236a7a3-ab66-4f76-aba7-ffed81663143\") " pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:17 crc kubenswrapper[4873]: I0219 11:03:17.393831 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e236a7a3-ab66-4f76-aba7-ffed81663143-utilities\") pod \"redhat-marketplace-svn5x\" (UID: \"e236a7a3-ab66-4f76-aba7-ffed81663143\") " pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:17 crc kubenswrapper[4873]: I0219 11:03:17.420370 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrlpd\" (UniqueName: \"kubernetes.io/projected/e236a7a3-ab66-4f76-aba7-ffed81663143-kube-api-access-xrlpd\") pod \"redhat-marketplace-svn5x\" (UID: \"e236a7a3-ab66-4f76-aba7-ffed81663143\") " pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:17 crc kubenswrapper[4873]: I0219 11:03:17.557077 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:18 crc kubenswrapper[4873]: I0219 11:03:18.142471 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-svn5x"] Feb 19 11:03:18 crc kubenswrapper[4873]: I0219 11:03:18.312552 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svn5x" event={"ID":"e236a7a3-ab66-4f76-aba7-ffed81663143","Type":"ContainerStarted","Data":"6196d4d73d0291d966aa2df84cf6131187ad22159d39b95f76a24db6129df9e9"} Feb 19 11:03:18 crc kubenswrapper[4873]: I0219 11:03:18.315518 4873 generic.go:334] "Generic (PLEG): container finished" podID="6e290883-f526-41db-a353-55a50f744490" containerID="72c590d804613f1c773acf673eb71ec0069c6e7e290132e13d2a88008e3611cd" exitCode=0 Feb 19 11:03:18 crc kubenswrapper[4873]: I0219 11:03:18.315547 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plncc" event={"ID":"6e290883-f526-41db-a353-55a50f744490","Type":"ContainerDied","Data":"72c590d804613f1c773acf673eb71ec0069c6e7e290132e13d2a88008e3611cd"} Feb 19 11:03:19 crc kubenswrapper[4873]: I0219 11:03:19.327538 4873 generic.go:334] "Generic (PLEG): container finished" podID="e236a7a3-ab66-4f76-aba7-ffed81663143" containerID="20f09bfbbc23c5c4835fd8a5c5af61ad45a563038f321621749eef3a9dd96f58" exitCode=0 Feb 19 11:03:19 crc kubenswrapper[4873]: I0219 11:03:19.327848 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svn5x" event={"ID":"e236a7a3-ab66-4f76-aba7-ffed81663143","Type":"ContainerDied","Data":"20f09bfbbc23c5c4835fd8a5c5af61ad45a563038f321621749eef3a9dd96f58"} Feb 19 11:03:20 crc kubenswrapper[4873]: I0219 11:03:20.340675 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plncc" 
event={"ID":"6e290883-f526-41db-a353-55a50f744490","Type":"ContainerStarted","Data":"36dd93803a9bba619ed90e1911c35791dfca87f72b721514b9e79a923453a2b3"} Feb 19 11:03:20 crc kubenswrapper[4873]: I0219 11:03:20.362580 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-plncc" podStartSLOduration=3.243244492 podStartE2EDuration="11.36255459s" podCreationTimestamp="2026-02-19 11:03:09 +0000 UTC" firstStartedPulling="2026-02-19 11:03:11.216199428 +0000 UTC m=+4700.505631066" lastFinishedPulling="2026-02-19 11:03:19.335509526 +0000 UTC m=+4708.624941164" observedRunningTime="2026-02-19 11:03:20.356722306 +0000 UTC m=+4709.646153954" watchObservedRunningTime="2026-02-19 11:03:20.36255459 +0000 UTC m=+4709.651986248" Feb 19 11:03:21 crc kubenswrapper[4873]: I0219 11:03:21.372504 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svn5x" event={"ID":"e236a7a3-ab66-4f76-aba7-ffed81663143","Type":"ContainerStarted","Data":"dd393fb68f4b4f99e55a564abf28d6d2280686070d70f6821f08f15dc6520ca4"} Feb 19 11:03:22 crc kubenswrapper[4873]: I0219 11:03:22.385245 4873 generic.go:334] "Generic (PLEG): container finished" podID="e236a7a3-ab66-4f76-aba7-ffed81663143" containerID="dd393fb68f4b4f99e55a564abf28d6d2280686070d70f6821f08f15dc6520ca4" exitCode=0 Feb 19 11:03:22 crc kubenswrapper[4873]: I0219 11:03:22.385325 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svn5x" event={"ID":"e236a7a3-ab66-4f76-aba7-ffed81663143","Type":"ContainerDied","Data":"dd393fb68f4b4f99e55a564abf28d6d2280686070d70f6821f08f15dc6520ca4"} Feb 19 11:03:25 crc kubenswrapper[4873]: I0219 11:03:25.416837 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svn5x" 
event={"ID":"e236a7a3-ab66-4f76-aba7-ffed81663143","Type":"ContainerStarted","Data":"3f172914177b2a80a2695f8e24b026849a54c503b1183af840362134ca514511"} Feb 19 11:03:25 crc kubenswrapper[4873]: I0219 11:03:25.440289 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-svn5x" podStartSLOduration=3.891249643 podStartE2EDuration="8.440273866s" podCreationTimestamp="2026-02-19 11:03:17 +0000 UTC" firstStartedPulling="2026-02-19 11:03:19.334640943 +0000 UTC m=+4708.624072581" lastFinishedPulling="2026-02-19 11:03:23.883665166 +0000 UTC m=+4713.173096804" observedRunningTime="2026-02-19 11:03:25.4396756 +0000 UTC m=+4714.729107238" watchObservedRunningTime="2026-02-19 11:03:25.440273866 +0000 UTC m=+4714.729705504" Feb 19 11:03:27 crc kubenswrapper[4873]: I0219 11:03:27.558193 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:27 crc kubenswrapper[4873]: I0219 11:03:27.558532 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:27 crc kubenswrapper[4873]: I0219 11:03:27.608380 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:29 crc kubenswrapper[4873]: I0219 11:03:29.625671 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:29 crc kubenswrapper[4873]: I0219 11:03:29.626004 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:30 crc kubenswrapper[4873]: I0219 11:03:30.673542 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-plncc" podUID="6e290883-f526-41db-a353-55a50f744490" containerName="registry-server" probeResult="failure" 
output=< Feb 19 11:03:30 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s Feb 19 11:03:30 crc kubenswrapper[4873]: > Feb 19 11:03:37 crc kubenswrapper[4873]: I0219 11:03:37.610998 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:37 crc kubenswrapper[4873]: I0219 11:03:37.664332 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-svn5x"] Feb 19 11:03:38 crc kubenswrapper[4873]: I0219 11:03:38.531809 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-svn5x" podUID="e236a7a3-ab66-4f76-aba7-ffed81663143" containerName="registry-server" containerID="cri-o://3f172914177b2a80a2695f8e24b026849a54c503b1183af840362134ca514511" gracePeriod=2 Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.041786 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.177974 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrlpd\" (UniqueName: \"kubernetes.io/projected/e236a7a3-ab66-4f76-aba7-ffed81663143-kube-api-access-xrlpd\") pod \"e236a7a3-ab66-4f76-aba7-ffed81663143\" (UID: \"e236a7a3-ab66-4f76-aba7-ffed81663143\") " Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.178152 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e236a7a3-ab66-4f76-aba7-ffed81663143-utilities\") pod \"e236a7a3-ab66-4f76-aba7-ffed81663143\" (UID: \"e236a7a3-ab66-4f76-aba7-ffed81663143\") " Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.178185 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e236a7a3-ab66-4f76-aba7-ffed81663143-catalog-content\") pod \"e236a7a3-ab66-4f76-aba7-ffed81663143\" (UID: \"e236a7a3-ab66-4f76-aba7-ffed81663143\") " Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.179010 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e236a7a3-ab66-4f76-aba7-ffed81663143-utilities" (OuterVolumeSpecName: "utilities") pod "e236a7a3-ab66-4f76-aba7-ffed81663143" (UID: "e236a7a3-ab66-4f76-aba7-ffed81663143"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.208302 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e236a7a3-ab66-4f76-aba7-ffed81663143-kube-api-access-xrlpd" (OuterVolumeSpecName: "kube-api-access-xrlpd") pod "e236a7a3-ab66-4f76-aba7-ffed81663143" (UID: "e236a7a3-ab66-4f76-aba7-ffed81663143"). InnerVolumeSpecName "kube-api-access-xrlpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.246352 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e236a7a3-ab66-4f76-aba7-ffed81663143-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e236a7a3-ab66-4f76-aba7-ffed81663143" (UID: "e236a7a3-ab66-4f76-aba7-ffed81663143"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.281755 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrlpd\" (UniqueName: \"kubernetes.io/projected/e236a7a3-ab66-4f76-aba7-ffed81663143-kube-api-access-xrlpd\") on node \"crc\" DevicePath \"\"" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.281815 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e236a7a3-ab66-4f76-aba7-ffed81663143-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.281828 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e236a7a3-ab66-4f76-aba7-ffed81663143-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.541693 4873 generic.go:334] "Generic (PLEG): container finished" podID="e236a7a3-ab66-4f76-aba7-ffed81663143" containerID="3f172914177b2a80a2695f8e24b026849a54c503b1183af840362134ca514511" exitCode=0 Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.541750 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svn5x" event={"ID":"e236a7a3-ab66-4f76-aba7-ffed81663143","Type":"ContainerDied","Data":"3f172914177b2a80a2695f8e24b026849a54c503b1183af840362134ca514511"} Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.541784 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svn5x" event={"ID":"e236a7a3-ab66-4f76-aba7-ffed81663143","Type":"ContainerDied","Data":"6196d4d73d0291d966aa2df84cf6131187ad22159d39b95f76a24db6129df9e9"} Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.541800 4873 scope.go:117] "RemoveContainer" containerID="3f172914177b2a80a2695f8e24b026849a54c503b1183af840362134ca514511" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 
11:03:39.541918 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svn5x" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.565216 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-svn5x"] Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.565632 4873 scope.go:117] "RemoveContainer" containerID="dd393fb68f4b4f99e55a564abf28d6d2280686070d70f6821f08f15dc6520ca4" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.578560 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-svn5x"] Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.589716 4873 scope.go:117] "RemoveContainer" containerID="20f09bfbbc23c5c4835fd8a5c5af61ad45a563038f321621749eef3a9dd96f58" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.637744 4873 scope.go:117] "RemoveContainer" containerID="3f172914177b2a80a2695f8e24b026849a54c503b1183af840362134ca514511" Feb 19 11:03:39 crc kubenswrapper[4873]: E0219 11:03:39.638192 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f172914177b2a80a2695f8e24b026849a54c503b1183af840362134ca514511\": container with ID starting with 3f172914177b2a80a2695f8e24b026849a54c503b1183af840362134ca514511 not found: ID does not exist" containerID="3f172914177b2a80a2695f8e24b026849a54c503b1183af840362134ca514511" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.638231 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f172914177b2a80a2695f8e24b026849a54c503b1183af840362134ca514511"} err="failed to get container status \"3f172914177b2a80a2695f8e24b026849a54c503b1183af840362134ca514511\": rpc error: code = NotFound desc = could not find container \"3f172914177b2a80a2695f8e24b026849a54c503b1183af840362134ca514511\": container with ID starting with 
3f172914177b2a80a2695f8e24b026849a54c503b1183af840362134ca514511 not found: ID does not exist" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.638264 4873 scope.go:117] "RemoveContainer" containerID="dd393fb68f4b4f99e55a564abf28d6d2280686070d70f6821f08f15dc6520ca4" Feb 19 11:03:39 crc kubenswrapper[4873]: E0219 11:03:39.638565 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd393fb68f4b4f99e55a564abf28d6d2280686070d70f6821f08f15dc6520ca4\": container with ID starting with dd393fb68f4b4f99e55a564abf28d6d2280686070d70f6821f08f15dc6520ca4 not found: ID does not exist" containerID="dd393fb68f4b4f99e55a564abf28d6d2280686070d70f6821f08f15dc6520ca4" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.638589 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd393fb68f4b4f99e55a564abf28d6d2280686070d70f6821f08f15dc6520ca4"} err="failed to get container status \"dd393fb68f4b4f99e55a564abf28d6d2280686070d70f6821f08f15dc6520ca4\": rpc error: code = NotFound desc = could not find container \"dd393fb68f4b4f99e55a564abf28d6d2280686070d70f6821f08f15dc6520ca4\": container with ID starting with dd393fb68f4b4f99e55a564abf28d6d2280686070d70f6821f08f15dc6520ca4 not found: ID does not exist" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.638604 4873 scope.go:117] "RemoveContainer" containerID="20f09bfbbc23c5c4835fd8a5c5af61ad45a563038f321621749eef3a9dd96f58" Feb 19 11:03:39 crc kubenswrapper[4873]: E0219 11:03:39.638836 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20f09bfbbc23c5c4835fd8a5c5af61ad45a563038f321621749eef3a9dd96f58\": container with ID starting with 20f09bfbbc23c5c4835fd8a5c5af61ad45a563038f321621749eef3a9dd96f58 not found: ID does not exist" containerID="20f09bfbbc23c5c4835fd8a5c5af61ad45a563038f321621749eef3a9dd96f58" Feb 19 11:03:39 crc 
kubenswrapper[4873]: I0219 11:03:39.638860 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f09bfbbc23c5c4835fd8a5c5af61ad45a563038f321621749eef3a9dd96f58"} err="failed to get container status \"20f09bfbbc23c5c4835fd8a5c5af61ad45a563038f321621749eef3a9dd96f58\": rpc error: code = NotFound desc = could not find container \"20f09bfbbc23c5c4835fd8a5c5af61ad45a563038f321621749eef3a9dd96f58\": container with ID starting with 20f09bfbbc23c5c4835fd8a5c5af61ad45a563038f321621749eef3a9dd96f58 not found: ID does not exist" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.674411 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:39 crc kubenswrapper[4873]: I0219 11:03:39.722916 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:40 crc kubenswrapper[4873]: I0219 11:03:40.847036 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-plncc"] Feb 19 11:03:41 crc kubenswrapper[4873]: I0219 11:03:41.496503 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e236a7a3-ab66-4f76-aba7-ffed81663143" path="/var/lib/kubelet/pods/e236a7a3-ab66-4f76-aba7-ffed81663143/volumes" Feb 19 11:03:41 crc kubenswrapper[4873]: I0219 11:03:41.566226 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-plncc" podUID="6e290883-f526-41db-a353-55a50f744490" containerName="registry-server" containerID="cri-o://36dd93803a9bba619ed90e1911c35791dfca87f72b721514b9e79a923453a2b3" gracePeriod=2 Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.074066 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.146869 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e290883-f526-41db-a353-55a50f744490-utilities\") pod \"6e290883-f526-41db-a353-55a50f744490\" (UID: \"6e290883-f526-41db-a353-55a50f744490\") " Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.147124 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e290883-f526-41db-a353-55a50f744490-catalog-content\") pod \"6e290883-f526-41db-a353-55a50f744490\" (UID: \"6e290883-f526-41db-a353-55a50f744490\") " Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.147226 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl4td\" (UniqueName: \"kubernetes.io/projected/6e290883-f526-41db-a353-55a50f744490-kube-api-access-gl4td\") pod \"6e290883-f526-41db-a353-55a50f744490\" (UID: \"6e290883-f526-41db-a353-55a50f744490\") " Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.148460 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e290883-f526-41db-a353-55a50f744490-utilities" (OuterVolumeSpecName: "utilities") pod "6e290883-f526-41db-a353-55a50f744490" (UID: "6e290883-f526-41db-a353-55a50f744490"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.250661 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e290883-f526-41db-a353-55a50f744490-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.272265 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e290883-f526-41db-a353-55a50f744490-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e290883-f526-41db-a353-55a50f744490" (UID: "6e290883-f526-41db-a353-55a50f744490"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.352872 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e290883-f526-41db-a353-55a50f744490-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.577504 4873 generic.go:334] "Generic (PLEG): container finished" podID="6e290883-f526-41db-a353-55a50f744490" containerID="36dd93803a9bba619ed90e1911c35791dfca87f72b721514b9e79a923453a2b3" exitCode=0 Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.577556 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plncc" event={"ID":"6e290883-f526-41db-a353-55a50f744490","Type":"ContainerDied","Data":"36dd93803a9bba619ed90e1911c35791dfca87f72b721514b9e79a923453a2b3"} Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.577587 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plncc" event={"ID":"6e290883-f526-41db-a353-55a50f744490","Type":"ContainerDied","Data":"5e8aa772e7b0649a59737f94f1f93bbab70267c1d35f86ec43c903ae04595e28"} Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.577609 4873 
scope.go:117] "RemoveContainer" containerID="36dd93803a9bba619ed90e1911c35791dfca87f72b721514b9e79a923453a2b3" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.577750 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-plncc" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.602915 4873 scope.go:117] "RemoveContainer" containerID="72c590d804613f1c773acf673eb71ec0069c6e7e290132e13d2a88008e3611cd" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.785268 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e290883-f526-41db-a353-55a50f744490-kube-api-access-gl4td" (OuterVolumeSpecName: "kube-api-access-gl4td") pod "6e290883-f526-41db-a353-55a50f744490" (UID: "6e290883-f526-41db-a353-55a50f744490"). InnerVolumeSpecName "kube-api-access-gl4td". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.821042 4873 scope.go:117] "RemoveContainer" containerID="6de9ee853a1df07b7a365ba1104b2be88eabbf7c1da72e76e0f20f1e8736adbe" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.865298 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl4td\" (UniqueName: \"kubernetes.io/projected/6e290883-f526-41db-a353-55a50f744490-kube-api-access-gl4td\") on node \"crc\" DevicePath \"\"" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.920278 4873 scope.go:117] "RemoveContainer" containerID="36dd93803a9bba619ed90e1911c35791dfca87f72b721514b9e79a923453a2b3" Feb 19 11:03:42 crc kubenswrapper[4873]: E0219 11:03:42.920905 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36dd93803a9bba619ed90e1911c35791dfca87f72b721514b9e79a923453a2b3\": container with ID starting with 36dd93803a9bba619ed90e1911c35791dfca87f72b721514b9e79a923453a2b3 not found: ID does not exist" 
containerID="36dd93803a9bba619ed90e1911c35791dfca87f72b721514b9e79a923453a2b3" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.921077 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36dd93803a9bba619ed90e1911c35791dfca87f72b721514b9e79a923453a2b3"} err="failed to get container status \"36dd93803a9bba619ed90e1911c35791dfca87f72b721514b9e79a923453a2b3\": rpc error: code = NotFound desc = could not find container \"36dd93803a9bba619ed90e1911c35791dfca87f72b721514b9e79a923453a2b3\": container with ID starting with 36dd93803a9bba619ed90e1911c35791dfca87f72b721514b9e79a923453a2b3 not found: ID does not exist" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.921276 4873 scope.go:117] "RemoveContainer" containerID="72c590d804613f1c773acf673eb71ec0069c6e7e290132e13d2a88008e3611cd" Feb 19 11:03:42 crc kubenswrapper[4873]: E0219 11:03:42.921701 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72c590d804613f1c773acf673eb71ec0069c6e7e290132e13d2a88008e3611cd\": container with ID starting with 72c590d804613f1c773acf673eb71ec0069c6e7e290132e13d2a88008e3611cd not found: ID does not exist" containerID="72c590d804613f1c773acf673eb71ec0069c6e7e290132e13d2a88008e3611cd" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.921724 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72c590d804613f1c773acf673eb71ec0069c6e7e290132e13d2a88008e3611cd"} err="failed to get container status \"72c590d804613f1c773acf673eb71ec0069c6e7e290132e13d2a88008e3611cd\": rpc error: code = NotFound desc = could not find container \"72c590d804613f1c773acf673eb71ec0069c6e7e290132e13d2a88008e3611cd\": container with ID starting with 72c590d804613f1c773acf673eb71ec0069c6e7e290132e13d2a88008e3611cd not found: ID does not exist" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.921737 4873 scope.go:117] 
"RemoveContainer" containerID="6de9ee853a1df07b7a365ba1104b2be88eabbf7c1da72e76e0f20f1e8736adbe" Feb 19 11:03:42 crc kubenswrapper[4873]: E0219 11:03:42.921916 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de9ee853a1df07b7a365ba1104b2be88eabbf7c1da72e76e0f20f1e8736adbe\": container with ID starting with 6de9ee853a1df07b7a365ba1104b2be88eabbf7c1da72e76e0f20f1e8736adbe not found: ID does not exist" containerID="6de9ee853a1df07b7a365ba1104b2be88eabbf7c1da72e76e0f20f1e8736adbe" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.921937 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de9ee853a1df07b7a365ba1104b2be88eabbf7c1da72e76e0f20f1e8736adbe"} err="failed to get container status \"6de9ee853a1df07b7a365ba1104b2be88eabbf7c1da72e76e0f20f1e8736adbe\": rpc error: code = NotFound desc = could not find container \"6de9ee853a1df07b7a365ba1104b2be88eabbf7c1da72e76e0f20f1e8736adbe\": container with ID starting with 6de9ee853a1df07b7a365ba1104b2be88eabbf7c1da72e76e0f20f1e8736adbe not found: ID does not exist" Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.983119 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-plncc"] Feb 19 11:03:42 crc kubenswrapper[4873]: I0219 11:03:42.993357 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-plncc"] Feb 19 11:03:43 crc kubenswrapper[4873]: I0219 11:03:43.496819 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e290883-f526-41db-a353-55a50f744490" path="/var/lib/kubelet/pods/6e290883-f526-41db-a353-55a50f744490/volumes" Feb 19 11:04:18 crc kubenswrapper[4873]: I0219 11:04:18.240699 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 11:04:18 crc kubenswrapper[4873]: I0219 11:04:18.241371 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 11:04:48 crc kubenswrapper[4873]: I0219 11:04:48.240718 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 11:04:48 crc kubenswrapper[4873]: I0219 11:04:48.241473 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.737743 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mk7zr"] Feb 19 11:05:03 crc kubenswrapper[4873]: E0219 11:05:03.739177 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e290883-f526-41db-a353-55a50f744490" containerName="registry-server" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.739204 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e290883-f526-41db-a353-55a50f744490" containerName="registry-server" Feb 19 11:05:03 crc kubenswrapper[4873]: E0219 11:05:03.739235 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e236a7a3-ab66-4f76-aba7-ffed81663143" 
containerName="extract-content" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.739243 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e236a7a3-ab66-4f76-aba7-ffed81663143" containerName="extract-content" Feb 19 11:05:03 crc kubenswrapper[4873]: E0219 11:05:03.739261 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e290883-f526-41db-a353-55a50f744490" containerName="extract-content" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.739269 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e290883-f526-41db-a353-55a50f744490" containerName="extract-content" Feb 19 11:05:03 crc kubenswrapper[4873]: E0219 11:05:03.739300 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e236a7a3-ab66-4f76-aba7-ffed81663143" containerName="extract-utilities" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.739309 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e236a7a3-ab66-4f76-aba7-ffed81663143" containerName="extract-utilities" Feb 19 11:05:03 crc kubenswrapper[4873]: E0219 11:05:03.739322 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e290883-f526-41db-a353-55a50f744490" containerName="extract-utilities" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.739330 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e290883-f526-41db-a353-55a50f744490" containerName="extract-utilities" Feb 19 11:05:03 crc kubenswrapper[4873]: E0219 11:05:03.739344 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e236a7a3-ab66-4f76-aba7-ffed81663143" containerName="registry-server" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.739351 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e236a7a3-ab66-4f76-aba7-ffed81663143" containerName="registry-server" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.739567 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="e236a7a3-ab66-4f76-aba7-ffed81663143" 
containerName="registry-server" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.739590 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e290883-f526-41db-a353-55a50f744490" containerName="registry-server" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.741029 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.744305 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f47487c-2e96-41ed-963c-de6c2f8bf152-utilities\") pod \"community-operators-mk7zr\" (UID: \"0f47487c-2e96-41ed-963c-de6c2f8bf152\") " pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.744426 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg72n\" (UniqueName: \"kubernetes.io/projected/0f47487c-2e96-41ed-963c-de6c2f8bf152-kube-api-access-hg72n\") pod \"community-operators-mk7zr\" (UID: \"0f47487c-2e96-41ed-963c-de6c2f8bf152\") " pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.744588 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f47487c-2e96-41ed-963c-de6c2f8bf152-catalog-content\") pod \"community-operators-mk7zr\" (UID: \"0f47487c-2e96-41ed-963c-de6c2f8bf152\") " pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.750042 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mk7zr"] Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.846138 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f47487c-2e96-41ed-963c-de6c2f8bf152-utilities\") pod \"community-operators-mk7zr\" (UID: \"0f47487c-2e96-41ed-963c-de6c2f8bf152\") " pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.846226 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg72n\" (UniqueName: \"kubernetes.io/projected/0f47487c-2e96-41ed-963c-de6c2f8bf152-kube-api-access-hg72n\") pod \"community-operators-mk7zr\" (UID: \"0f47487c-2e96-41ed-963c-de6c2f8bf152\") " pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.846247 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f47487c-2e96-41ed-963c-de6c2f8bf152-catalog-content\") pod \"community-operators-mk7zr\" (UID: \"0f47487c-2e96-41ed-963c-de6c2f8bf152\") " pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.846722 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f47487c-2e96-41ed-963c-de6c2f8bf152-catalog-content\") pod \"community-operators-mk7zr\" (UID: \"0f47487c-2e96-41ed-963c-de6c2f8bf152\") " pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.846764 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f47487c-2e96-41ed-963c-de6c2f8bf152-utilities\") pod \"community-operators-mk7zr\" (UID: \"0f47487c-2e96-41ed-963c-de6c2f8bf152\") " pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:03 crc kubenswrapper[4873]: I0219 11:05:03.866437 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg72n\" (UniqueName: 
\"kubernetes.io/projected/0f47487c-2e96-41ed-963c-de6c2f8bf152-kube-api-access-hg72n\") pod \"community-operators-mk7zr\" (UID: \"0f47487c-2e96-41ed-963c-de6c2f8bf152\") " pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:04 crc kubenswrapper[4873]: I0219 11:05:04.089221 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:05 crc kubenswrapper[4873]: I0219 11:05:05.038675 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mk7zr"] Feb 19 11:05:05 crc kubenswrapper[4873]: I0219 11:05:05.354856 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mk7zr" event={"ID":"0f47487c-2e96-41ed-963c-de6c2f8bf152","Type":"ContainerStarted","Data":"49e088349e642ccc00b12a3c3c974f30b5fd5aa45f7ce3ce9abd0f16f1f209ae"} Feb 19 11:05:06 crc kubenswrapper[4873]: I0219 11:05:06.366155 4873 generic.go:334] "Generic (PLEG): container finished" podID="0f47487c-2e96-41ed-963c-de6c2f8bf152" containerID="31b07f779794e549307f2d11327a6f33ea37e2331b358a4ed386e94c4d86c687" exitCode=0 Feb 19 11:05:06 crc kubenswrapper[4873]: I0219 11:05:06.366198 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mk7zr" event={"ID":"0f47487c-2e96-41ed-963c-de6c2f8bf152","Type":"ContainerDied","Data":"31b07f779794e549307f2d11327a6f33ea37e2331b358a4ed386e94c4d86c687"} Feb 19 11:05:06 crc kubenswrapper[4873]: I0219 11:05:06.369291 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 11:05:08 crc kubenswrapper[4873]: I0219 11:05:08.403240 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mk7zr" event={"ID":"0f47487c-2e96-41ed-963c-de6c2f8bf152","Type":"ContainerStarted","Data":"202f50cf0f27476ee66f9c1c7fe1a62c5909a5ac115a540d13f5e5df05dd9cca"} Feb 19 11:05:09 
crc kubenswrapper[4873]: I0219 11:05:09.413473 4873 generic.go:334] "Generic (PLEG): container finished" podID="0f47487c-2e96-41ed-963c-de6c2f8bf152" containerID="202f50cf0f27476ee66f9c1c7fe1a62c5909a5ac115a540d13f5e5df05dd9cca" exitCode=0 Feb 19 11:05:09 crc kubenswrapper[4873]: I0219 11:05:09.413524 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mk7zr" event={"ID":"0f47487c-2e96-41ed-963c-de6c2f8bf152","Type":"ContainerDied","Data":"202f50cf0f27476ee66f9c1c7fe1a62c5909a5ac115a540d13f5e5df05dd9cca"} Feb 19 11:05:10 crc kubenswrapper[4873]: I0219 11:05:10.424079 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mk7zr" event={"ID":"0f47487c-2e96-41ed-963c-de6c2f8bf152","Type":"ContainerStarted","Data":"41137e75f9f5c19cc876f43bc5ed5e45aaa46d6bad5e34f1de8feb9dc0615e8a"} Feb 19 11:05:10 crc kubenswrapper[4873]: I0219 11:05:10.445649 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mk7zr" podStartSLOduration=3.747127523 podStartE2EDuration="7.445627854s" podCreationTimestamp="2026-02-19 11:05:03 +0000 UTC" firstStartedPulling="2026-02-19 11:05:06.368987145 +0000 UTC m=+4815.658418803" lastFinishedPulling="2026-02-19 11:05:10.067487496 +0000 UTC m=+4819.356919134" observedRunningTime="2026-02-19 11:05:10.439966015 +0000 UTC m=+4819.729397653" watchObservedRunningTime="2026-02-19 11:05:10.445627854 +0000 UTC m=+4819.735059492" Feb 19 11:05:14 crc kubenswrapper[4873]: I0219 11:05:14.089594 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:14 crc kubenswrapper[4873]: I0219 11:05:14.090074 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:14 crc kubenswrapper[4873]: I0219 11:05:14.143843 4873 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:18 crc kubenswrapper[4873]: I0219 11:05:18.240621 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 11:05:18 crc kubenswrapper[4873]: I0219 11:05:18.241374 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 11:05:18 crc kubenswrapper[4873]: I0219 11:05:18.242122 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 11:05:18 crc kubenswrapper[4873]: I0219 11:05:18.243180 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb43480cbf19d4507b02920f7c71a6827821b09f9e9f251bd0c0f1803ed97739"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 11:05:18 crc kubenswrapper[4873]: I0219 11:05:18.243271 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://cb43480cbf19d4507b02920f7c71a6827821b09f9e9f251bd0c0f1803ed97739" gracePeriod=600 Feb 19 11:05:19 crc kubenswrapper[4873]: I0219 11:05:19.513952 4873 generic.go:334] "Generic (PLEG): container finished" 
podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="cb43480cbf19d4507b02920f7c71a6827821b09f9e9f251bd0c0f1803ed97739" exitCode=0 Feb 19 11:05:19 crc kubenswrapper[4873]: I0219 11:05:19.514055 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"cb43480cbf19d4507b02920f7c71a6827821b09f9e9f251bd0c0f1803ed97739"} Feb 19 11:05:19 crc kubenswrapper[4873]: I0219 11:05:19.514705 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7"} Feb 19 11:05:19 crc kubenswrapper[4873]: I0219 11:05:19.514737 4873 scope.go:117] "RemoveContainer" containerID="22be4fcac63d93394195fc713b152ebad1813a068085fb725288b1a582e6c2de" Feb 19 11:05:24 crc kubenswrapper[4873]: I0219 11:05:24.141660 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:24 crc kubenswrapper[4873]: I0219 11:05:24.228814 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mk7zr"] Feb 19 11:05:24 crc kubenswrapper[4873]: I0219 11:05:24.560248 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mk7zr" podUID="0f47487c-2e96-41ed-963c-de6c2f8bf152" containerName="registry-server" containerID="cri-o://41137e75f9f5c19cc876f43bc5ed5e45aaa46d6bad5e34f1de8feb9dc0615e8a" gracePeriod=2 Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.034917 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mk7zr" Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.202894 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f47487c-2e96-41ed-963c-de6c2f8bf152-catalog-content\") pod \"0f47487c-2e96-41ed-963c-de6c2f8bf152\" (UID: \"0f47487c-2e96-41ed-963c-de6c2f8bf152\") " Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.202988 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg72n\" (UniqueName: \"kubernetes.io/projected/0f47487c-2e96-41ed-963c-de6c2f8bf152-kube-api-access-hg72n\") pod \"0f47487c-2e96-41ed-963c-de6c2f8bf152\" (UID: \"0f47487c-2e96-41ed-963c-de6c2f8bf152\") " Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.203240 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f47487c-2e96-41ed-963c-de6c2f8bf152-utilities\") pod \"0f47487c-2e96-41ed-963c-de6c2f8bf152\" (UID: \"0f47487c-2e96-41ed-963c-de6c2f8bf152\") " Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.204537 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f47487c-2e96-41ed-963c-de6c2f8bf152-utilities" (OuterVolumeSpecName: "utilities") pod "0f47487c-2e96-41ed-963c-de6c2f8bf152" (UID: "0f47487c-2e96-41ed-963c-de6c2f8bf152"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.215319 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f47487c-2e96-41ed-963c-de6c2f8bf152-kube-api-access-hg72n" (OuterVolumeSpecName: "kube-api-access-hg72n") pod "0f47487c-2e96-41ed-963c-de6c2f8bf152" (UID: "0f47487c-2e96-41ed-963c-de6c2f8bf152"). InnerVolumeSpecName "kube-api-access-hg72n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.265854 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f47487c-2e96-41ed-963c-de6c2f8bf152-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f47487c-2e96-41ed-963c-de6c2f8bf152" (UID: "0f47487c-2e96-41ed-963c-de6c2f8bf152"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.306271 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg72n\" (UniqueName: \"kubernetes.io/projected/0f47487c-2e96-41ed-963c-de6c2f8bf152-kube-api-access-hg72n\") on node \"crc\" DevicePath \"\"" Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.306324 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f47487c-2e96-41ed-963c-de6c2f8bf152-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.306344 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f47487c-2e96-41ed-963c-de6c2f8bf152-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.572551 4873 generic.go:334] "Generic (PLEG): container finished" podID="0f47487c-2e96-41ed-963c-de6c2f8bf152" containerID="41137e75f9f5c19cc876f43bc5ed5e45aaa46d6bad5e34f1de8feb9dc0615e8a" exitCode=0 Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.572591 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mk7zr" event={"ID":"0f47487c-2e96-41ed-963c-de6c2f8bf152","Type":"ContainerDied","Data":"41137e75f9f5c19cc876f43bc5ed5e45aaa46d6bad5e34f1de8feb9dc0615e8a"} Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.572621 4873 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-mk7zr" event={"ID":"0f47487c-2e96-41ed-963c-de6c2f8bf152","Type":"ContainerDied","Data":"49e088349e642ccc00b12a3c3c974f30b5fd5aa45f7ce3ce9abd0f16f1f209ae"}
Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.572641 4873 scope.go:117] "RemoveContainer" containerID="41137e75f9f5c19cc876f43bc5ed5e45aaa46d6bad5e34f1de8feb9dc0615e8a"
Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.572845 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mk7zr"
Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.603852 4873 scope.go:117] "RemoveContainer" containerID="202f50cf0f27476ee66f9c1c7fe1a62c5909a5ac115a540d13f5e5df05dd9cca"
Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.613462 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mk7zr"]
Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.628418 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mk7zr"]
Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.639681 4873 scope.go:117] "RemoveContainer" containerID="31b07f779794e549307f2d11327a6f33ea37e2331b358a4ed386e94c4d86c687"
Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.686676 4873 scope.go:117] "RemoveContainer" containerID="41137e75f9f5c19cc876f43bc5ed5e45aaa46d6bad5e34f1de8feb9dc0615e8a"
Feb 19 11:05:25 crc kubenswrapper[4873]: E0219 11:05:25.687362 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41137e75f9f5c19cc876f43bc5ed5e45aaa46d6bad5e34f1de8feb9dc0615e8a\": container with ID starting with 41137e75f9f5c19cc876f43bc5ed5e45aaa46d6bad5e34f1de8feb9dc0615e8a not found: ID does not exist" containerID="41137e75f9f5c19cc876f43bc5ed5e45aaa46d6bad5e34f1de8feb9dc0615e8a"
Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.687421 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41137e75f9f5c19cc876f43bc5ed5e45aaa46d6bad5e34f1de8feb9dc0615e8a"} err="failed to get container status \"41137e75f9f5c19cc876f43bc5ed5e45aaa46d6bad5e34f1de8feb9dc0615e8a\": rpc error: code = NotFound desc = could not find container \"41137e75f9f5c19cc876f43bc5ed5e45aaa46d6bad5e34f1de8feb9dc0615e8a\": container with ID starting with 41137e75f9f5c19cc876f43bc5ed5e45aaa46d6bad5e34f1de8feb9dc0615e8a not found: ID does not exist"
Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.687454 4873 scope.go:117] "RemoveContainer" containerID="202f50cf0f27476ee66f9c1c7fe1a62c5909a5ac115a540d13f5e5df05dd9cca"
Feb 19 11:05:25 crc kubenswrapper[4873]: E0219 11:05:25.687835 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"202f50cf0f27476ee66f9c1c7fe1a62c5909a5ac115a540d13f5e5df05dd9cca\": container with ID starting with 202f50cf0f27476ee66f9c1c7fe1a62c5909a5ac115a540d13f5e5df05dd9cca not found: ID does not exist" containerID="202f50cf0f27476ee66f9c1c7fe1a62c5909a5ac115a540d13f5e5df05dd9cca"
Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.687862 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"202f50cf0f27476ee66f9c1c7fe1a62c5909a5ac115a540d13f5e5df05dd9cca"} err="failed to get container status \"202f50cf0f27476ee66f9c1c7fe1a62c5909a5ac115a540d13f5e5df05dd9cca\": rpc error: code = NotFound desc = could not find container \"202f50cf0f27476ee66f9c1c7fe1a62c5909a5ac115a540d13f5e5df05dd9cca\": container with ID starting with 202f50cf0f27476ee66f9c1c7fe1a62c5909a5ac115a540d13f5e5df05dd9cca not found: ID does not exist"
Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.687881 4873 scope.go:117] "RemoveContainer" containerID="31b07f779794e549307f2d11327a6f33ea37e2331b358a4ed386e94c4d86c687"
Feb 19 11:05:25 crc kubenswrapper[4873]: E0219 11:05:25.688236 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31b07f779794e549307f2d11327a6f33ea37e2331b358a4ed386e94c4d86c687\": container with ID starting with 31b07f779794e549307f2d11327a6f33ea37e2331b358a4ed386e94c4d86c687 not found: ID does not exist" containerID="31b07f779794e549307f2d11327a6f33ea37e2331b358a4ed386e94c4d86c687"
Feb 19 11:05:25 crc kubenswrapper[4873]: I0219 11:05:25.688254 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b07f779794e549307f2d11327a6f33ea37e2331b358a4ed386e94c4d86c687"} err="failed to get container status \"31b07f779794e549307f2d11327a6f33ea37e2331b358a4ed386e94c4d86c687\": rpc error: code = NotFound desc = could not find container \"31b07f779794e549307f2d11327a6f33ea37e2331b358a4ed386e94c4d86c687\": container with ID starting with 31b07f779794e549307f2d11327a6f33ea37e2331b358a4ed386e94c4d86c687 not found: ID does not exist"
Feb 19 11:05:27 crc kubenswrapper[4873]: I0219 11:05:27.496383 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f47487c-2e96-41ed-963c-de6c2f8bf152" path="/var/lib/kubelet/pods/0f47487c-2e96-41ed-963c-de6c2f8bf152/volumes"
Feb 19 11:07:48 crc kubenswrapper[4873]: I0219 11:07:48.240633 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 11:07:48 crc kubenswrapper[4873]: I0219 11:07:48.241052 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 11:08:18 crc kubenswrapper[4873]: I0219 11:08:18.240596 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 11:08:18 crc kubenswrapper[4873]: I0219 11:08:18.241134 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 11:08:48 crc kubenswrapper[4873]: I0219 11:08:48.240610 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 11:08:48 crc kubenswrapper[4873]: I0219 11:08:48.242351 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 11:08:48 crc kubenswrapper[4873]: I0219 11:08:48.242497 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7"
Feb 19 11:08:48 crc kubenswrapper[4873]: I0219 11:08:48.243465 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 11:08:48 crc kubenswrapper[4873]: I0219 11:08:48.243675 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" gracePeriod=600
Feb 19 11:08:48 crc kubenswrapper[4873]: E0219 11:08:48.369594 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:08:48 crc kubenswrapper[4873]: I0219 11:08:48.694886 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" exitCode=0
Feb 19 11:08:48 crc kubenswrapper[4873]: I0219 11:08:48.694936 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7"}
Feb 19 11:08:48 crc kubenswrapper[4873]: I0219 11:08:48.694972 4873 scope.go:117] "RemoveContainer" containerID="cb43480cbf19d4507b02920f7c71a6827821b09f9e9f251bd0c0f1803ed97739"
Feb 19 11:08:48 crc kubenswrapper[4873]: I0219 11:08:48.695605 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7"
Feb 19 11:08:48 crc kubenswrapper[4873]: E0219 11:08:48.695898 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:09:03 crc kubenswrapper[4873]: I0219 11:09:03.495710 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7"
Feb 19 11:09:03 crc kubenswrapper[4873]: E0219 11:09:03.500801 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:09:17 crc kubenswrapper[4873]: I0219 11:09:17.486146 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7"
Feb 19 11:09:17 crc kubenswrapper[4873]: E0219 11:09:17.487178 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.263564 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jdlp2"]
Feb 19 11:09:25 crc kubenswrapper[4873]: E0219 11:09:25.264514 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f47487c-2e96-41ed-963c-de6c2f8bf152" containerName="registry-server"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.264529 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f47487c-2e96-41ed-963c-de6c2f8bf152" containerName="registry-server"
Feb 19 11:09:25 crc kubenswrapper[4873]: E0219 11:09:25.264550 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f47487c-2e96-41ed-963c-de6c2f8bf152" containerName="extract-content"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.264556 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f47487c-2e96-41ed-963c-de6c2f8bf152" containerName="extract-content"
Feb 19 11:09:25 crc kubenswrapper[4873]: E0219 11:09:25.264588 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f47487c-2e96-41ed-963c-de6c2f8bf152" containerName="extract-utilities"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.264594 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f47487c-2e96-41ed-963c-de6c2f8bf152" containerName="extract-utilities"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.264773 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f47487c-2e96-41ed-963c-de6c2f8bf152" containerName="registry-server"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.266406 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.281005 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jdlp2"]
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.414139 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84eca575-0570-46eb-9694-8646236d7aba-catalog-content\") pod \"certified-operators-jdlp2\" (UID: \"84eca575-0570-46eb-9694-8646236d7aba\") " pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.414189 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84eca575-0570-46eb-9694-8646236d7aba-utilities\") pod \"certified-operators-jdlp2\" (UID: \"84eca575-0570-46eb-9694-8646236d7aba\") " pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.414325 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gjwd\" (UniqueName: \"kubernetes.io/projected/84eca575-0570-46eb-9694-8646236d7aba-kube-api-access-5gjwd\") pod \"certified-operators-jdlp2\" (UID: \"84eca575-0570-46eb-9694-8646236d7aba\") " pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.517423 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84eca575-0570-46eb-9694-8646236d7aba-catalog-content\") pod \"certified-operators-jdlp2\" (UID: \"84eca575-0570-46eb-9694-8646236d7aba\") " pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.517492 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84eca575-0570-46eb-9694-8646236d7aba-utilities\") pod \"certified-operators-jdlp2\" (UID: \"84eca575-0570-46eb-9694-8646236d7aba\") " pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.517551 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gjwd\" (UniqueName: \"kubernetes.io/projected/84eca575-0570-46eb-9694-8646236d7aba-kube-api-access-5gjwd\") pod \"certified-operators-jdlp2\" (UID: \"84eca575-0570-46eb-9694-8646236d7aba\") " pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.518718 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84eca575-0570-46eb-9694-8646236d7aba-catalog-content\") pod \"certified-operators-jdlp2\" (UID: \"84eca575-0570-46eb-9694-8646236d7aba\") " pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.519024 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84eca575-0570-46eb-9694-8646236d7aba-utilities\") pod \"certified-operators-jdlp2\" (UID: \"84eca575-0570-46eb-9694-8646236d7aba\") " pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.537418 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gjwd\" (UniqueName: \"kubernetes.io/projected/84eca575-0570-46eb-9694-8646236d7aba-kube-api-access-5gjwd\") pod \"certified-operators-jdlp2\" (UID: \"84eca575-0570-46eb-9694-8646236d7aba\") " pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:25 crc kubenswrapper[4873]: I0219 11:09:25.600010 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:26 crc kubenswrapper[4873]: I0219 11:09:26.114911 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jdlp2"]
Feb 19 11:09:27 crc kubenswrapper[4873]: I0219 11:09:27.036355 4873 generic.go:334] "Generic (PLEG): container finished" podID="84eca575-0570-46eb-9694-8646236d7aba" containerID="34065fcc2769694373b98c09ba55e3507a8fb714d7e6db36212106b4806b4117" exitCode=0
Feb 19 11:09:27 crc kubenswrapper[4873]: I0219 11:09:27.036767 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdlp2" event={"ID":"84eca575-0570-46eb-9694-8646236d7aba","Type":"ContainerDied","Data":"34065fcc2769694373b98c09ba55e3507a8fb714d7e6db36212106b4806b4117"}
Feb 19 11:09:27 crc kubenswrapper[4873]: I0219 11:09:27.036800 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdlp2" event={"ID":"84eca575-0570-46eb-9694-8646236d7aba","Type":"ContainerStarted","Data":"4a8305bd2296efcc2dbff347d1dc9f7b7681ae97e62bc72f892f55c705a6994a"}
Feb 19 11:09:29 crc kubenswrapper[4873]: I0219 11:09:29.060595 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdlp2" event={"ID":"84eca575-0570-46eb-9694-8646236d7aba","Type":"ContainerStarted","Data":"bd4a9f0919f6049daf2c8d424e66fe24bb3a5c9ea29b2e45e7d71be383355a3a"}
Feb 19 11:09:29 crc kubenswrapper[4873]: I0219 11:09:29.485285 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7"
Feb 19 11:09:29 crc kubenswrapper[4873]: E0219 11:09:29.485937 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:09:30 crc kubenswrapper[4873]: I0219 11:09:30.071260 4873 generic.go:334] "Generic (PLEG): container finished" podID="84eca575-0570-46eb-9694-8646236d7aba" containerID="bd4a9f0919f6049daf2c8d424e66fe24bb3a5c9ea29b2e45e7d71be383355a3a" exitCode=0
Feb 19 11:09:30 crc kubenswrapper[4873]: I0219 11:09:30.071304 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdlp2" event={"ID":"84eca575-0570-46eb-9694-8646236d7aba","Type":"ContainerDied","Data":"bd4a9f0919f6049daf2c8d424e66fe24bb3a5c9ea29b2e45e7d71be383355a3a"}
Feb 19 11:09:31 crc kubenswrapper[4873]: I0219 11:09:31.081548 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdlp2" event={"ID":"84eca575-0570-46eb-9694-8646236d7aba","Type":"ContainerStarted","Data":"f71393ccfb697c7f860c7995ccd8ba3b74dd9fa62fe2d5e21c727436b4f3d871"}
Feb 19 11:09:31 crc kubenswrapper[4873]: I0219 11:09:31.104453 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jdlp2" podStartSLOduration=2.653862857 podStartE2EDuration="6.104435463s" podCreationTimestamp="2026-02-19 11:09:25 +0000 UTC" firstStartedPulling="2026-02-19 11:09:27.038363537 +0000 UTC m=+5076.327795175" lastFinishedPulling="2026-02-19 11:09:30.488936143 +0000 UTC m=+5079.778367781" observedRunningTime="2026-02-19 11:09:31.096689488 +0000 UTC m=+5080.386121146" watchObservedRunningTime="2026-02-19 11:09:31.104435463 +0000 UTC m=+5080.393867101"
Feb 19 11:09:35 crc kubenswrapper[4873]: I0219 11:09:35.600883 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:35 crc kubenswrapper[4873]: I0219 11:09:35.602040 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:35 crc kubenswrapper[4873]: I0219 11:09:35.648222 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:36 crc kubenswrapper[4873]: I0219 11:09:36.174654 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:36 crc kubenswrapper[4873]: I0219 11:09:36.222933 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jdlp2"]
Feb 19 11:09:38 crc kubenswrapper[4873]: I0219 11:09:38.151394 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jdlp2" podUID="84eca575-0570-46eb-9694-8646236d7aba" containerName="registry-server" containerID="cri-o://f71393ccfb697c7f860c7995ccd8ba3b74dd9fa62fe2d5e21c727436b4f3d871" gracePeriod=2
Feb 19 11:09:38 crc kubenswrapper[4873]: I0219 11:09:38.649228 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:38 crc kubenswrapper[4873]: I0219 11:09:38.802314 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84eca575-0570-46eb-9694-8646236d7aba-catalog-content\") pod \"84eca575-0570-46eb-9694-8646236d7aba\" (UID: \"84eca575-0570-46eb-9694-8646236d7aba\") "
Feb 19 11:09:38 crc kubenswrapper[4873]: I0219 11:09:38.802662 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84eca575-0570-46eb-9694-8646236d7aba-utilities\") pod \"84eca575-0570-46eb-9694-8646236d7aba\" (UID: \"84eca575-0570-46eb-9694-8646236d7aba\") "
Feb 19 11:09:38 crc kubenswrapper[4873]: I0219 11:09:38.803809 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84eca575-0570-46eb-9694-8646236d7aba-utilities" (OuterVolumeSpecName: "utilities") pod "84eca575-0570-46eb-9694-8646236d7aba" (UID: "84eca575-0570-46eb-9694-8646236d7aba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 11:09:38 crc kubenswrapper[4873]: I0219 11:09:38.803946 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gjwd\" (UniqueName: \"kubernetes.io/projected/84eca575-0570-46eb-9694-8646236d7aba-kube-api-access-5gjwd\") pod \"84eca575-0570-46eb-9694-8646236d7aba\" (UID: \"84eca575-0570-46eb-9694-8646236d7aba\") "
Feb 19 11:09:38 crc kubenswrapper[4873]: I0219 11:09:38.805191 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84eca575-0570-46eb-9694-8646236d7aba-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 11:09:38 crc kubenswrapper[4873]: I0219 11:09:38.810869 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84eca575-0570-46eb-9694-8646236d7aba-kube-api-access-5gjwd" (OuterVolumeSpecName: "kube-api-access-5gjwd") pod "84eca575-0570-46eb-9694-8646236d7aba" (UID: "84eca575-0570-46eb-9694-8646236d7aba"). InnerVolumeSpecName "kube-api-access-5gjwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 11:09:38 crc kubenswrapper[4873]: I0219 11:09:38.866272 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84eca575-0570-46eb-9694-8646236d7aba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84eca575-0570-46eb-9694-8646236d7aba" (UID: "84eca575-0570-46eb-9694-8646236d7aba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 11:09:38 crc kubenswrapper[4873]: I0219 11:09:38.906677 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gjwd\" (UniqueName: \"kubernetes.io/projected/84eca575-0570-46eb-9694-8646236d7aba-kube-api-access-5gjwd\") on node \"crc\" DevicePath \"\""
Feb 19 11:09:38 crc kubenswrapper[4873]: I0219 11:09:38.906718 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84eca575-0570-46eb-9694-8646236d7aba-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.161907 4873 generic.go:334] "Generic (PLEG): container finished" podID="84eca575-0570-46eb-9694-8646236d7aba" containerID="f71393ccfb697c7f860c7995ccd8ba3b74dd9fa62fe2d5e21c727436b4f3d871" exitCode=0
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.161957 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdlp2" event={"ID":"84eca575-0570-46eb-9694-8646236d7aba","Type":"ContainerDied","Data":"f71393ccfb697c7f860c7995ccd8ba3b74dd9fa62fe2d5e21c727436b4f3d871"}
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.161988 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdlp2" event={"ID":"84eca575-0570-46eb-9694-8646236d7aba","Type":"ContainerDied","Data":"4a8305bd2296efcc2dbff347d1dc9f7b7681ae97e62bc72f892f55c705a6994a"}
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.162007 4873 scope.go:117] "RemoveContainer" containerID="f71393ccfb697c7f860c7995ccd8ba3b74dd9fa62fe2d5e21c727436b4f3d871"
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.162177 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jdlp2"
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.191055 4873 scope.go:117] "RemoveContainer" containerID="bd4a9f0919f6049daf2c8d424e66fe24bb3a5c9ea29b2e45e7d71be383355a3a"
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.220312 4873 scope.go:117] "RemoveContainer" containerID="34065fcc2769694373b98c09ba55e3507a8fb714d7e6db36212106b4806b4117"
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.228560 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jdlp2"]
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.237885 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jdlp2"]
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.282210 4873 scope.go:117] "RemoveContainer" containerID="f71393ccfb697c7f860c7995ccd8ba3b74dd9fa62fe2d5e21c727436b4f3d871"
Feb 19 11:09:39 crc kubenswrapper[4873]: E0219 11:09:39.282658 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71393ccfb697c7f860c7995ccd8ba3b74dd9fa62fe2d5e21c727436b4f3d871\": container with ID starting with f71393ccfb697c7f860c7995ccd8ba3b74dd9fa62fe2d5e21c727436b4f3d871 not found: ID does not exist" containerID="f71393ccfb697c7f860c7995ccd8ba3b74dd9fa62fe2d5e21c727436b4f3d871"
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.282782 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71393ccfb697c7f860c7995ccd8ba3b74dd9fa62fe2d5e21c727436b4f3d871"} err="failed to get container status \"f71393ccfb697c7f860c7995ccd8ba3b74dd9fa62fe2d5e21c727436b4f3d871\": rpc error: code = NotFound desc = could not find container \"f71393ccfb697c7f860c7995ccd8ba3b74dd9fa62fe2d5e21c727436b4f3d871\": container with ID starting with f71393ccfb697c7f860c7995ccd8ba3b74dd9fa62fe2d5e21c727436b4f3d871 not found: ID does not exist"
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.282913 4873 scope.go:117] "RemoveContainer" containerID="bd4a9f0919f6049daf2c8d424e66fe24bb3a5c9ea29b2e45e7d71be383355a3a"
Feb 19 11:09:39 crc kubenswrapper[4873]: E0219 11:09:39.282837 4873 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84eca575_0570_46eb_9694_8646236d7aba.slice/crio-4a8305bd2296efcc2dbff347d1dc9f7b7681ae97e62bc72f892f55c705a6994a\": RecentStats: unable to find data in memory cache]"
Feb 19 11:09:39 crc kubenswrapper[4873]: E0219 11:09:39.283440 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd4a9f0919f6049daf2c8d424e66fe24bb3a5c9ea29b2e45e7d71be383355a3a\": container with ID starting with bd4a9f0919f6049daf2c8d424e66fe24bb3a5c9ea29b2e45e7d71be383355a3a not found: ID does not exist" containerID="bd4a9f0919f6049daf2c8d424e66fe24bb3a5c9ea29b2e45e7d71be383355a3a"
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.283469 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd4a9f0919f6049daf2c8d424e66fe24bb3a5c9ea29b2e45e7d71be383355a3a"} err="failed to get container status \"bd4a9f0919f6049daf2c8d424e66fe24bb3a5c9ea29b2e45e7d71be383355a3a\": rpc error: code = NotFound desc = could not find container \"bd4a9f0919f6049daf2c8d424e66fe24bb3a5c9ea29b2e45e7d71be383355a3a\": container with ID starting with bd4a9f0919f6049daf2c8d424e66fe24bb3a5c9ea29b2e45e7d71be383355a3a not found: ID does not exist"
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.283487 4873 scope.go:117] "RemoveContainer" containerID="34065fcc2769694373b98c09ba55e3507a8fb714d7e6db36212106b4806b4117"
Feb 19 11:09:39 crc kubenswrapper[4873]: E0219 11:09:39.283947 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34065fcc2769694373b98c09ba55e3507a8fb714d7e6db36212106b4806b4117\": container with ID starting with 34065fcc2769694373b98c09ba55e3507a8fb714d7e6db36212106b4806b4117 not found: ID does not exist" containerID="34065fcc2769694373b98c09ba55e3507a8fb714d7e6db36212106b4806b4117"
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.284011 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34065fcc2769694373b98c09ba55e3507a8fb714d7e6db36212106b4806b4117"} err="failed to get container status \"34065fcc2769694373b98c09ba55e3507a8fb714d7e6db36212106b4806b4117\": rpc error: code = NotFound desc = could not find container \"34065fcc2769694373b98c09ba55e3507a8fb714d7e6db36212106b4806b4117\": container with ID starting with 34065fcc2769694373b98c09ba55e3507a8fb714d7e6db36212106b4806b4117 not found: ID does not exist"
Feb 19 11:09:39 crc kubenswrapper[4873]: I0219 11:09:39.499395 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84eca575-0570-46eb-9694-8646236d7aba" path="/var/lib/kubelet/pods/84eca575-0570-46eb-9694-8646236d7aba/volumes"
Feb 19 11:09:43 crc kubenswrapper[4873]: I0219 11:09:43.484788 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7"
Feb 19 11:09:43 crc kubenswrapper[4873]: E0219 11:09:43.485878 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:09:58 crc kubenswrapper[4873]: I0219 11:09:58.484551 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7"
Feb 19 11:09:58 crc kubenswrapper[4873]: E0219 11:09:58.485213 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:10:13 crc kubenswrapper[4873]: I0219 11:10:13.488135 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7"
Feb 19 11:10:13 crc kubenswrapper[4873]: E0219 11:10:13.488959 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:10:24 crc kubenswrapper[4873]: I0219 11:10:24.484861 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7"
Feb 19 11:10:24 crc kubenswrapper[4873]: E0219 11:10:24.485677 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:10:29 crc kubenswrapper[4873]: I0219 11:10:29.640811 4873 generic.go:334] "Generic (PLEG): container finished" podID="5e5a79da-a068-4a68-ba79-6719ea0fb353" containerID="edd9b7584d145cddbcf9d8449ca8d5546aa8224b7f3731235eeab85ccb091862" exitCode=0
Feb 19 11:10:29 crc kubenswrapper[4873]: I0219 11:10:29.640900 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5e5a79da-a068-4a68-ba79-6719ea0fb353","Type":"ContainerDied","Data":"edd9b7584d145cddbcf9d8449ca8d5546aa8224b7f3731235eeab85ccb091862"}
Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.005723 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.119706 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"5e5a79da-a068-4a68-ba79-6719ea0fb353\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") "
Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.119812 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e5a79da-a068-4a68-ba79-6719ea0fb353-openstack-config\") pod \"5e5a79da-a068-4a68-ba79-6719ea0fb353\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") "
Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.119878 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-ssh-key\") pod \"5e5a79da-a068-4a68-ba79-6719ea0fb353\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") "
Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.119931 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e5a79da-a068-4a68-ba79-6719ea0fb353-config-data\") pod \"5e5a79da-a068-4a68-ba79-6719ea0fb353\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") "
Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.119973 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e5a79da-a068-4a68-ba79-6719ea0fb353-test-operator-ephemeral-temporary\") pod \"5e5a79da-a068-4a68-ba79-6719ea0fb353\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") "
Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.120021 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-ca-certs\") pod \"5e5a79da-a068-4a68-ba79-6719ea0fb353\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") "
Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.120064 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e5a79da-a068-4a68-ba79-6719ea0fb353-test-operator-ephemeral-workdir\") pod \"5e5a79da-a068-4a68-ba79-6719ea0fb353\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") "
Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.120091 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-openstack-config-secret\") pod \"5e5a79da-a068-4a68-ba79-6719ea0fb353\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") "
Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.120177 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdqm6\" (UniqueName: \"kubernetes.io/projected/5e5a79da-a068-4a68-ba79-6719ea0fb353-kube-api-access-hdqm6\") pod \"5e5a79da-a068-4a68-ba79-6719ea0fb353\" (UID: \"5e5a79da-a068-4a68-ba79-6719ea0fb353\") "
Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.120682 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e5a79da-a068-4a68-ba79-6719ea0fb353-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "5e5a79da-a068-4a68-ba79-6719ea0fb353" (UID: "5e5a79da-a068-4a68-ba79-6719ea0fb353"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.120839 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e5a79da-a068-4a68-ba79-6719ea0fb353-config-data" (OuterVolumeSpecName: "config-data") pod "5e5a79da-a068-4a68-ba79-6719ea0fb353" (UID: "5e5a79da-a068-4a68-ba79-6719ea0fb353"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.128369 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e5a79da-a068-4a68-ba79-6719ea0fb353-kube-api-access-hdqm6" (OuterVolumeSpecName: "kube-api-access-hdqm6") pod "5e5a79da-a068-4a68-ba79-6719ea0fb353" (UID: "5e5a79da-a068-4a68-ba79-6719ea0fb353"). InnerVolumeSpecName "kube-api-access-hdqm6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.129649 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "5e5a79da-a068-4a68-ba79-6719ea0fb353" (UID: "5e5a79da-a068-4a68-ba79-6719ea0fb353"). InnerVolumeSpecName "local-storage06-crc".
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.130827 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e5a79da-a068-4a68-ba79-6719ea0fb353-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "5e5a79da-a068-4a68-ba79-6719ea0fb353" (UID: "5e5a79da-a068-4a68-ba79-6719ea0fb353"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.151923 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5e5a79da-a068-4a68-ba79-6719ea0fb353" (UID: "5e5a79da-a068-4a68-ba79-6719ea0fb353"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.155185 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "5e5a79da-a068-4a68-ba79-6719ea0fb353" (UID: "5e5a79da-a068-4a68-ba79-6719ea0fb353"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.155676 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5e5a79da-a068-4a68-ba79-6719ea0fb353" (UID: "5e5a79da-a068-4a68-ba79-6719ea0fb353"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.182733 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e5a79da-a068-4a68-ba79-6719ea0fb353-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5e5a79da-a068-4a68-ba79-6719ea0fb353" (UID: "5e5a79da-a068-4a68-ba79-6719ea0fb353"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.222391 4873 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.222431 4873 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e5a79da-a068-4a68-ba79-6719ea0fb353-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.222443 4873 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.222452 4873 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e5a79da-a068-4a68-ba79-6719ea0fb353-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.222463 4873 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5e5a79da-a068-4a68-ba79-6719ea0fb353-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.222480 4873 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.222495 4873 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5e5a79da-a068-4a68-ba79-6719ea0fb353-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.222511 4873 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e5a79da-a068-4a68-ba79-6719ea0fb353-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.222536 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdqm6\" (UniqueName: \"kubernetes.io/projected/5e5a79da-a068-4a68-ba79-6719ea0fb353-kube-api-access-hdqm6\") on node \"crc\" DevicePath \"\"" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.246736 4873 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.325475 4873 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.662603 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5e5a79da-a068-4a68-ba79-6719ea0fb353","Type":"ContainerDied","Data":"2ab42e52f993d6514497f49e7da17659fa93e4ac5da7295a0f0f52c753b83b71"} Feb 19 11:10:31 crc kubenswrapper[4873]: I0219 11:10:31.662966 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ab42e52f993d6514497f49e7da17659fa93e4ac5da7295a0f0f52c753b83b71" Feb 19 11:10:31 crc 
kubenswrapper[4873]: I0219 11:10:31.662638 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 11:10:37 crc kubenswrapper[4873]: I0219 11:10:37.485194 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:10:37 crc kubenswrapper[4873]: E0219 11:10:37.486516 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.092290 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 19 11:10:41 crc kubenswrapper[4873]: E0219 11:10:41.093357 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84eca575-0570-46eb-9694-8646236d7aba" containerName="registry-server" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.093373 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="84eca575-0570-46eb-9694-8646236d7aba" containerName="registry-server" Feb 19 11:10:41 crc kubenswrapper[4873]: E0219 11:10:41.093398 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84eca575-0570-46eb-9694-8646236d7aba" containerName="extract-content" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.093406 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="84eca575-0570-46eb-9694-8646236d7aba" containerName="extract-content" Feb 19 11:10:41 crc kubenswrapper[4873]: E0219 11:10:41.093417 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5a79da-a068-4a68-ba79-6719ea0fb353" 
containerName="tempest-tests-tempest-tests-runner" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.093425 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5a79da-a068-4a68-ba79-6719ea0fb353" containerName="tempest-tests-tempest-tests-runner" Feb 19 11:10:41 crc kubenswrapper[4873]: E0219 11:10:41.093465 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84eca575-0570-46eb-9694-8646236d7aba" containerName="extract-utilities" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.093475 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="84eca575-0570-46eb-9694-8646236d7aba" containerName="extract-utilities" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.093697 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="84eca575-0570-46eb-9694-8646236d7aba" containerName="registry-server" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.093729 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e5a79da-a068-4a68-ba79-6719ea0fb353" containerName="tempest-tests-tempest-tests-runner" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.094534 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.096895 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5bdht" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.102575 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.205536 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nwhs\" (UniqueName: \"kubernetes.io/projected/58738a83-0734-4889-9b0e-650e43f6dbb7-kube-api-access-5nwhs\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"58738a83-0734-4889-9b0e-650e43f6dbb7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.205617 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"58738a83-0734-4889-9b0e-650e43f6dbb7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.308681 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nwhs\" (UniqueName: \"kubernetes.io/projected/58738a83-0734-4889-9b0e-650e43f6dbb7-kube-api-access-5nwhs\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"58738a83-0734-4889-9b0e-650e43f6dbb7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.308880 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"58738a83-0734-4889-9b0e-650e43f6dbb7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.309500 4873 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"58738a83-0734-4889-9b0e-650e43f6dbb7\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.332783 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nwhs\" (UniqueName: \"kubernetes.io/projected/58738a83-0734-4889-9b0e-650e43f6dbb7-kube-api-access-5nwhs\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"58738a83-0734-4889-9b0e-650e43f6dbb7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.343280 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"58738a83-0734-4889-9b0e-650e43f6dbb7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.418937 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.838540 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 19 11:10:41 crc kubenswrapper[4873]: I0219 11:10:41.846944 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 11:10:42 crc kubenswrapper[4873]: I0219 11:10:42.778962 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"58738a83-0734-4889-9b0e-650e43f6dbb7","Type":"ContainerStarted","Data":"eaf32f57e0eb0c863d6da631f84ac1ca33473332d8b0e93ea2bb36f2f78e6202"} Feb 19 11:10:44 crc kubenswrapper[4873]: I0219 11:10:44.159557 4873 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-prw4c" podUID="4cc54252-cfdf-4b71-bfa5-552dcd26500d" containerName="registry-server" probeResult="failure" output=< Feb 19 11:10:44 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s Feb 19 11:10:44 crc kubenswrapper[4873]: > Feb 19 11:10:44 crc kubenswrapper[4873]: I0219 11:10:44.164635 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-prw4c" podUID="4cc54252-cfdf-4b71-bfa5-552dcd26500d" containerName="registry-server" probeResult="failure" output=< Feb 19 11:10:44 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s Feb 19 11:10:44 crc kubenswrapper[4873]: > Feb 19 11:10:45 crc kubenswrapper[4873]: I0219 11:10:45.808553 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"58738a83-0734-4889-9b0e-650e43f6dbb7","Type":"ContainerStarted","Data":"23b91139685a6cdb90a8794eb5accb2a7251f0de1685199bd21c49c5f88f9f84"} Feb 19 11:10:45 crc kubenswrapper[4873]: 
I0219 11:10:45.831452 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.985016231 podStartE2EDuration="4.831415835s" podCreationTimestamp="2026-02-19 11:10:41 +0000 UTC" firstStartedPulling="2026-02-19 11:10:41.84675338 +0000 UTC m=+5151.136185018" lastFinishedPulling="2026-02-19 11:10:44.693152984 +0000 UTC m=+5153.982584622" observedRunningTime="2026-02-19 11:10:45.822378538 +0000 UTC m=+5155.111810176" watchObservedRunningTime="2026-02-19 11:10:45.831415835 +0000 UTC m=+5155.120847513" Feb 19 11:10:48 crc kubenswrapper[4873]: I0219 11:10:48.485307 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:10:48 crc kubenswrapper[4873]: E0219 11:10:48.485922 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:11:01 crc kubenswrapper[4873]: I0219 11:11:01.490547 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:11:01 crc kubenswrapper[4873]: E0219 11:11:01.491356 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:11:09 crc kubenswrapper[4873]: I0219 
11:11:09.415033 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lms9s/must-gather-lgwst"] Feb 19 11:11:09 crc kubenswrapper[4873]: I0219 11:11:09.417180 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lms9s/must-gather-lgwst" Feb 19 11:11:09 crc kubenswrapper[4873]: I0219 11:11:09.420584 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lms9s"/"openshift-service-ca.crt" Feb 19 11:11:09 crc kubenswrapper[4873]: I0219 11:11:09.420936 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lms9s"/"default-dockercfg-6s6vh" Feb 19 11:11:09 crc kubenswrapper[4873]: I0219 11:11:09.422173 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lms9s"/"kube-root-ca.crt" Feb 19 11:11:09 crc kubenswrapper[4873]: I0219 11:11:09.431793 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9b49\" (UniqueName: \"kubernetes.io/projected/a6f4f1cb-6b34-4940-be18-6ba992fd72d7-kube-api-access-n9b49\") pod \"must-gather-lgwst\" (UID: \"a6f4f1cb-6b34-4940-be18-6ba992fd72d7\") " pod="openshift-must-gather-lms9s/must-gather-lgwst" Feb 19 11:11:09 crc kubenswrapper[4873]: I0219 11:11:09.431938 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a6f4f1cb-6b34-4940-be18-6ba992fd72d7-must-gather-output\") pod \"must-gather-lgwst\" (UID: \"a6f4f1cb-6b34-4940-be18-6ba992fd72d7\") " pod="openshift-must-gather-lms9s/must-gather-lgwst" Feb 19 11:11:09 crc kubenswrapper[4873]: I0219 11:11:09.439651 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lms9s/must-gather-lgwst"] Feb 19 11:11:09 crc kubenswrapper[4873]: I0219 11:11:09.536379 4873 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-n9b49\" (UniqueName: \"kubernetes.io/projected/a6f4f1cb-6b34-4940-be18-6ba992fd72d7-kube-api-access-n9b49\") pod \"must-gather-lgwst\" (UID: \"a6f4f1cb-6b34-4940-be18-6ba992fd72d7\") " pod="openshift-must-gather-lms9s/must-gather-lgwst" Feb 19 11:11:09 crc kubenswrapper[4873]: I0219 11:11:09.540224 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a6f4f1cb-6b34-4940-be18-6ba992fd72d7-must-gather-output\") pod \"must-gather-lgwst\" (UID: \"a6f4f1cb-6b34-4940-be18-6ba992fd72d7\") " pod="openshift-must-gather-lms9s/must-gather-lgwst" Feb 19 11:11:09 crc kubenswrapper[4873]: I0219 11:11:09.540724 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a6f4f1cb-6b34-4940-be18-6ba992fd72d7-must-gather-output\") pod \"must-gather-lgwst\" (UID: \"a6f4f1cb-6b34-4940-be18-6ba992fd72d7\") " pod="openshift-must-gather-lms9s/must-gather-lgwst" Feb 19 11:11:09 crc kubenswrapper[4873]: I0219 11:11:09.555329 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9b49\" (UniqueName: \"kubernetes.io/projected/a6f4f1cb-6b34-4940-be18-6ba992fd72d7-kube-api-access-n9b49\") pod \"must-gather-lgwst\" (UID: \"a6f4f1cb-6b34-4940-be18-6ba992fd72d7\") " pod="openshift-must-gather-lms9s/must-gather-lgwst" Feb 19 11:11:09 crc kubenswrapper[4873]: I0219 11:11:09.737866 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lms9s/must-gather-lgwst" Feb 19 11:11:10 crc kubenswrapper[4873]: I0219 11:11:10.425502 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lms9s/must-gather-lgwst"] Feb 19 11:11:11 crc kubenswrapper[4873]: I0219 11:11:11.103559 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lms9s/must-gather-lgwst" event={"ID":"a6f4f1cb-6b34-4940-be18-6ba992fd72d7","Type":"ContainerStarted","Data":"80a48f2e0ac5a77d4e93a660520e3615ef4baebafeaa87193cb6260f26588930"} Feb 19 11:11:16 crc kubenswrapper[4873]: I0219 11:11:16.484620 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:11:16 crc kubenswrapper[4873]: E0219 11:11:16.485438 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:11:17 crc kubenswrapper[4873]: I0219 11:11:17.175795 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lms9s/must-gather-lgwst" event={"ID":"a6f4f1cb-6b34-4940-be18-6ba992fd72d7","Type":"ContainerStarted","Data":"d21919709c4c5fc4cd8c1920eab8a780a95e4fd90dfbc7716f574f51697d2620"} Feb 19 11:11:17 crc kubenswrapper[4873]: I0219 11:11:17.176353 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lms9s/must-gather-lgwst" event={"ID":"a6f4f1cb-6b34-4940-be18-6ba992fd72d7","Type":"ContainerStarted","Data":"ec0556d441d0438d8f010526f2d7e9ab2d6477bf9ac4834a65568abbbc6e94c7"} Feb 19 11:11:17 crc kubenswrapper[4873]: I0219 11:11:17.192802 4873 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-lms9s/must-gather-lgwst" podStartSLOduration=2.193241735 podStartE2EDuration="8.192783645s" podCreationTimestamp="2026-02-19 11:11:09 +0000 UTC" firstStartedPulling="2026-02-19 11:11:10.444779365 +0000 UTC m=+5179.734211003" lastFinishedPulling="2026-02-19 11:11:16.444321275 +0000 UTC m=+5185.733752913" observedRunningTime="2026-02-19 11:11:17.192417296 +0000 UTC m=+5186.481848934" watchObservedRunningTime="2026-02-19 11:11:17.192783645 +0000 UTC m=+5186.482215293" Feb 19 11:11:22 crc kubenswrapper[4873]: I0219 11:11:22.228538 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lms9s/crc-debug-gjmz2"] Feb 19 11:11:22 crc kubenswrapper[4873]: I0219 11:11:22.230479 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lms9s/crc-debug-gjmz2" Feb 19 11:11:22 crc kubenswrapper[4873]: I0219 11:11:22.313285 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wrzd\" (UniqueName: \"kubernetes.io/projected/610332ca-5405-4232-bdf7-e716c30e4e29-kube-api-access-2wrzd\") pod \"crc-debug-gjmz2\" (UID: \"610332ca-5405-4232-bdf7-e716c30e4e29\") " pod="openshift-must-gather-lms9s/crc-debug-gjmz2" Feb 19 11:11:22 crc kubenswrapper[4873]: I0219 11:11:22.313736 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/610332ca-5405-4232-bdf7-e716c30e4e29-host\") pod \"crc-debug-gjmz2\" (UID: \"610332ca-5405-4232-bdf7-e716c30e4e29\") " pod="openshift-must-gather-lms9s/crc-debug-gjmz2" Feb 19 11:11:22 crc kubenswrapper[4873]: I0219 11:11:22.415956 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wrzd\" (UniqueName: \"kubernetes.io/projected/610332ca-5405-4232-bdf7-e716c30e4e29-kube-api-access-2wrzd\") pod \"crc-debug-gjmz2\" (UID: 
\"610332ca-5405-4232-bdf7-e716c30e4e29\") " pod="openshift-must-gather-lms9s/crc-debug-gjmz2" Feb 19 11:11:22 crc kubenswrapper[4873]: I0219 11:11:22.416061 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/610332ca-5405-4232-bdf7-e716c30e4e29-host\") pod \"crc-debug-gjmz2\" (UID: \"610332ca-5405-4232-bdf7-e716c30e4e29\") " pod="openshift-must-gather-lms9s/crc-debug-gjmz2" Feb 19 11:11:22 crc kubenswrapper[4873]: I0219 11:11:22.416180 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/610332ca-5405-4232-bdf7-e716c30e4e29-host\") pod \"crc-debug-gjmz2\" (UID: \"610332ca-5405-4232-bdf7-e716c30e4e29\") " pod="openshift-must-gather-lms9s/crc-debug-gjmz2" Feb 19 11:11:22 crc kubenswrapper[4873]: I0219 11:11:22.438857 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wrzd\" (UniqueName: \"kubernetes.io/projected/610332ca-5405-4232-bdf7-e716c30e4e29-kube-api-access-2wrzd\") pod \"crc-debug-gjmz2\" (UID: \"610332ca-5405-4232-bdf7-e716c30e4e29\") " pod="openshift-must-gather-lms9s/crc-debug-gjmz2" Feb 19 11:11:22 crc kubenswrapper[4873]: I0219 11:11:22.549580 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lms9s/crc-debug-gjmz2" Feb 19 11:11:23 crc kubenswrapper[4873]: I0219 11:11:23.227713 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lms9s/crc-debug-gjmz2" event={"ID":"610332ca-5405-4232-bdf7-e716c30e4e29","Type":"ContainerStarted","Data":"4565dec49529c796afa90816eae8a0e8246af5cab44bb9b4785466235f31f90e"} Feb 19 11:11:30 crc kubenswrapper[4873]: I0219 11:11:30.484495 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:11:30 crc kubenswrapper[4873]: E0219 11:11:30.485337 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:11:35 crc kubenswrapper[4873]: I0219 11:11:35.349938 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lms9s/crc-debug-gjmz2" event={"ID":"610332ca-5405-4232-bdf7-e716c30e4e29","Type":"ContainerStarted","Data":"e1e47bb1ff8a672d572432525165620c6fe8cdbb5878839e48d767115053def9"} Feb 19 11:11:35 crc kubenswrapper[4873]: I0219 11:11:35.371269 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lms9s/crc-debug-gjmz2" podStartSLOduration=0.986640113 podStartE2EDuration="13.371252908s" podCreationTimestamp="2026-02-19 11:11:22 +0000 UTC" firstStartedPulling="2026-02-19 11:11:22.610977897 +0000 UTC m=+5191.900409535" lastFinishedPulling="2026-02-19 11:11:34.995590692 +0000 UTC m=+5204.285022330" observedRunningTime="2026-02-19 11:11:35.361156305 +0000 UTC m=+5204.650587983" watchObservedRunningTime="2026-02-19 11:11:35.371252908 +0000 UTC 
m=+5204.660684546" Feb 19 11:11:41 crc kubenswrapper[4873]: I0219 11:11:41.493157 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:11:41 crc kubenswrapper[4873]: E0219 11:11:41.493999 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:11:55 crc kubenswrapper[4873]: I0219 11:11:55.485009 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:11:55 crc kubenswrapper[4873]: E0219 11:11:55.485793 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:12:10 crc kubenswrapper[4873]: I0219 11:12:10.484333 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:12:10 crc kubenswrapper[4873]: E0219 11:12:10.485085 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" 
podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:12:21 crc kubenswrapper[4873]: I0219 11:12:21.496634 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:12:21 crc kubenswrapper[4873]: E0219 11:12:21.497493 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:12:24 crc kubenswrapper[4873]: I0219 11:12:24.895516 4873 generic.go:334] "Generic (PLEG): container finished" podID="610332ca-5405-4232-bdf7-e716c30e4e29" containerID="e1e47bb1ff8a672d572432525165620c6fe8cdbb5878839e48d767115053def9" exitCode=0 Feb 19 11:12:24 crc kubenswrapper[4873]: I0219 11:12:24.895595 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lms9s/crc-debug-gjmz2" event={"ID":"610332ca-5405-4232-bdf7-e716c30e4e29","Type":"ContainerDied","Data":"e1e47bb1ff8a672d572432525165620c6fe8cdbb5878839e48d767115053def9"} Feb 19 11:12:26 crc kubenswrapper[4873]: I0219 11:12:26.609692 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lms9s/crc-debug-gjmz2" Feb 19 11:12:26 crc kubenswrapper[4873]: I0219 11:12:26.664567 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lms9s/crc-debug-gjmz2"] Feb 19 11:12:26 crc kubenswrapper[4873]: I0219 11:12:26.675249 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lms9s/crc-debug-gjmz2"] Feb 19 11:12:26 crc kubenswrapper[4873]: I0219 11:12:26.696607 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wrzd\" (UniqueName: \"kubernetes.io/projected/610332ca-5405-4232-bdf7-e716c30e4e29-kube-api-access-2wrzd\") pod \"610332ca-5405-4232-bdf7-e716c30e4e29\" (UID: \"610332ca-5405-4232-bdf7-e716c30e4e29\") " Feb 19 11:12:26 crc kubenswrapper[4873]: I0219 11:12:26.696680 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/610332ca-5405-4232-bdf7-e716c30e4e29-host\") pod \"610332ca-5405-4232-bdf7-e716c30e4e29\" (UID: \"610332ca-5405-4232-bdf7-e716c30e4e29\") " Feb 19 11:12:26 crc kubenswrapper[4873]: I0219 11:12:26.696853 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/610332ca-5405-4232-bdf7-e716c30e4e29-host" (OuterVolumeSpecName: "host") pod "610332ca-5405-4232-bdf7-e716c30e4e29" (UID: "610332ca-5405-4232-bdf7-e716c30e4e29"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 11:12:26 crc kubenswrapper[4873]: I0219 11:12:26.697420 4873 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/610332ca-5405-4232-bdf7-e716c30e4e29-host\") on node \"crc\" DevicePath \"\"" Feb 19 11:12:26 crc kubenswrapper[4873]: I0219 11:12:26.705992 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/610332ca-5405-4232-bdf7-e716c30e4e29-kube-api-access-2wrzd" (OuterVolumeSpecName: "kube-api-access-2wrzd") pod "610332ca-5405-4232-bdf7-e716c30e4e29" (UID: "610332ca-5405-4232-bdf7-e716c30e4e29"). InnerVolumeSpecName "kube-api-access-2wrzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:12:26 crc kubenswrapper[4873]: I0219 11:12:26.801534 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wrzd\" (UniqueName: \"kubernetes.io/projected/610332ca-5405-4232-bdf7-e716c30e4e29-kube-api-access-2wrzd\") on node \"crc\" DevicePath \"\"" Feb 19 11:12:26 crc kubenswrapper[4873]: I0219 11:12:26.918025 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4565dec49529c796afa90816eae8a0e8246af5cab44bb9b4785466235f31f90e" Feb 19 11:12:26 crc kubenswrapper[4873]: I0219 11:12:26.918341 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lms9s/crc-debug-gjmz2" Feb 19 11:12:27 crc kubenswrapper[4873]: I0219 11:12:27.494281 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="610332ca-5405-4232-bdf7-e716c30e4e29" path="/var/lib/kubelet/pods/610332ca-5405-4232-bdf7-e716c30e4e29/volumes" Feb 19 11:12:27 crc kubenswrapper[4873]: I0219 11:12:27.787516 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lms9s/crc-debug-58t8c"] Feb 19 11:12:27 crc kubenswrapper[4873]: E0219 11:12:27.788047 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="610332ca-5405-4232-bdf7-e716c30e4e29" containerName="container-00" Feb 19 11:12:27 crc kubenswrapper[4873]: I0219 11:12:27.788065 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="610332ca-5405-4232-bdf7-e716c30e4e29" containerName="container-00" Feb 19 11:12:27 crc kubenswrapper[4873]: I0219 11:12:27.788317 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="610332ca-5405-4232-bdf7-e716c30e4e29" containerName="container-00" Feb 19 11:12:27 crc kubenswrapper[4873]: I0219 11:12:27.789159 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lms9s/crc-debug-58t8c" Feb 19 11:12:27 crc kubenswrapper[4873]: I0219 11:12:27.936234 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab4a09e3-7ea1-4764-b0e6-75c1b549b77a-host\") pod \"crc-debug-58t8c\" (UID: \"ab4a09e3-7ea1-4764-b0e6-75c1b549b77a\") " pod="openshift-must-gather-lms9s/crc-debug-58t8c" Feb 19 11:12:27 crc kubenswrapper[4873]: I0219 11:12:27.936300 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6j7p\" (UniqueName: \"kubernetes.io/projected/ab4a09e3-7ea1-4764-b0e6-75c1b549b77a-kube-api-access-b6j7p\") pod \"crc-debug-58t8c\" (UID: \"ab4a09e3-7ea1-4764-b0e6-75c1b549b77a\") " pod="openshift-must-gather-lms9s/crc-debug-58t8c" Feb 19 11:12:28 crc kubenswrapper[4873]: I0219 11:12:28.038199 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab4a09e3-7ea1-4764-b0e6-75c1b549b77a-host\") pod \"crc-debug-58t8c\" (UID: \"ab4a09e3-7ea1-4764-b0e6-75c1b549b77a\") " pod="openshift-must-gather-lms9s/crc-debug-58t8c" Feb 19 11:12:28 crc kubenswrapper[4873]: I0219 11:12:28.038340 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab4a09e3-7ea1-4764-b0e6-75c1b549b77a-host\") pod \"crc-debug-58t8c\" (UID: \"ab4a09e3-7ea1-4764-b0e6-75c1b549b77a\") " pod="openshift-must-gather-lms9s/crc-debug-58t8c" Feb 19 11:12:28 crc kubenswrapper[4873]: I0219 11:12:28.038641 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6j7p\" (UniqueName: \"kubernetes.io/projected/ab4a09e3-7ea1-4764-b0e6-75c1b549b77a-kube-api-access-b6j7p\") pod \"crc-debug-58t8c\" (UID: \"ab4a09e3-7ea1-4764-b0e6-75c1b549b77a\") " pod="openshift-must-gather-lms9s/crc-debug-58t8c" Feb 19 11:12:28 crc 
kubenswrapper[4873]: I0219 11:12:28.065093 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6j7p\" (UniqueName: \"kubernetes.io/projected/ab4a09e3-7ea1-4764-b0e6-75c1b549b77a-kube-api-access-b6j7p\") pod \"crc-debug-58t8c\" (UID: \"ab4a09e3-7ea1-4764-b0e6-75c1b549b77a\") " pod="openshift-must-gather-lms9s/crc-debug-58t8c" Feb 19 11:12:28 crc kubenswrapper[4873]: I0219 11:12:28.115572 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lms9s/crc-debug-58t8c" Feb 19 11:12:28 crc kubenswrapper[4873]: I0219 11:12:28.944871 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lms9s/crc-debug-58t8c" event={"ID":"ab4a09e3-7ea1-4764-b0e6-75c1b549b77a","Type":"ContainerStarted","Data":"9881fb46a0d2568f8dddfc8da179e6f97035508c7f4131e1a48dad7a6c2fd139"} Feb 19 11:12:28 crc kubenswrapper[4873]: I0219 11:12:28.945295 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lms9s/crc-debug-58t8c" event={"ID":"ab4a09e3-7ea1-4764-b0e6-75c1b549b77a","Type":"ContainerStarted","Data":"c3a9a5b478958c6bcc89f1f4f518abb55bf2fac1e192b323dbd5a87dee63bee8"} Feb 19 11:12:28 crc kubenswrapper[4873]: I0219 11:12:28.978824 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lms9s/crc-debug-58t8c" podStartSLOduration=1.9788021850000002 podStartE2EDuration="1.978802185s" podCreationTimestamp="2026-02-19 11:12:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 11:12:28.95871427 +0000 UTC m=+5258.248145908" watchObservedRunningTime="2026-02-19 11:12:28.978802185 +0000 UTC m=+5258.268233823" Feb 19 11:12:29 crc kubenswrapper[4873]: I0219 11:12:29.954579 4873 generic.go:334] "Generic (PLEG): container finished" podID="ab4a09e3-7ea1-4764-b0e6-75c1b549b77a" 
containerID="9881fb46a0d2568f8dddfc8da179e6f97035508c7f4131e1a48dad7a6c2fd139" exitCode=0 Feb 19 11:12:29 crc kubenswrapper[4873]: I0219 11:12:29.954634 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lms9s/crc-debug-58t8c" event={"ID":"ab4a09e3-7ea1-4764-b0e6-75c1b549b77a","Type":"ContainerDied","Data":"9881fb46a0d2568f8dddfc8da179e6f97035508c7f4131e1a48dad7a6c2fd139"} Feb 19 11:12:31 crc kubenswrapper[4873]: I0219 11:12:31.089205 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lms9s/crc-debug-58t8c" Feb 19 11:12:31 crc kubenswrapper[4873]: I0219 11:12:31.200465 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab4a09e3-7ea1-4764-b0e6-75c1b549b77a-host\") pod \"ab4a09e3-7ea1-4764-b0e6-75c1b549b77a\" (UID: \"ab4a09e3-7ea1-4764-b0e6-75c1b549b77a\") " Feb 19 11:12:31 crc kubenswrapper[4873]: I0219 11:12:31.200759 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab4a09e3-7ea1-4764-b0e6-75c1b549b77a-host" (OuterVolumeSpecName: "host") pod "ab4a09e3-7ea1-4764-b0e6-75c1b549b77a" (UID: "ab4a09e3-7ea1-4764-b0e6-75c1b549b77a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 11:12:31 crc kubenswrapper[4873]: I0219 11:12:31.201340 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6j7p\" (UniqueName: \"kubernetes.io/projected/ab4a09e3-7ea1-4764-b0e6-75c1b549b77a-kube-api-access-b6j7p\") pod \"ab4a09e3-7ea1-4764-b0e6-75c1b549b77a\" (UID: \"ab4a09e3-7ea1-4764-b0e6-75c1b549b77a\") " Feb 19 11:12:31 crc kubenswrapper[4873]: I0219 11:12:31.202412 4873 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab4a09e3-7ea1-4764-b0e6-75c1b549b77a-host\") on node \"crc\" DevicePath \"\"" Feb 19 11:12:31 crc kubenswrapper[4873]: I0219 11:12:31.224477 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab4a09e3-7ea1-4764-b0e6-75c1b549b77a-kube-api-access-b6j7p" (OuterVolumeSpecName: "kube-api-access-b6j7p") pod "ab4a09e3-7ea1-4764-b0e6-75c1b549b77a" (UID: "ab4a09e3-7ea1-4764-b0e6-75c1b549b77a"). InnerVolumeSpecName "kube-api-access-b6j7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:12:31 crc kubenswrapper[4873]: I0219 11:12:31.304784 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6j7p\" (UniqueName: \"kubernetes.io/projected/ab4a09e3-7ea1-4764-b0e6-75c1b549b77a-kube-api-access-b6j7p\") on node \"crc\" DevicePath \"\"" Feb 19 11:12:31 crc kubenswrapper[4873]: I0219 11:12:31.576407 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lms9s/crc-debug-58t8c"] Feb 19 11:12:31 crc kubenswrapper[4873]: I0219 11:12:31.585039 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lms9s/crc-debug-58t8c"] Feb 19 11:12:31 crc kubenswrapper[4873]: I0219 11:12:31.977136 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3a9a5b478958c6bcc89f1f4f518abb55bf2fac1e192b323dbd5a87dee63bee8" Feb 19 11:12:31 crc kubenswrapper[4873]: I0219 11:12:31.977210 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lms9s/crc-debug-58t8c" Feb 19 11:12:32 crc kubenswrapper[4873]: I0219 11:12:32.769296 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lms9s/crc-debug-6mcmb"] Feb 19 11:12:32 crc kubenswrapper[4873]: E0219 11:12:32.769861 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab4a09e3-7ea1-4764-b0e6-75c1b549b77a" containerName="container-00" Feb 19 11:12:32 crc kubenswrapper[4873]: I0219 11:12:32.769876 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab4a09e3-7ea1-4764-b0e6-75c1b549b77a" containerName="container-00" Feb 19 11:12:32 crc kubenswrapper[4873]: I0219 11:12:32.770145 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab4a09e3-7ea1-4764-b0e6-75c1b549b77a" containerName="container-00" Feb 19 11:12:32 crc kubenswrapper[4873]: I0219 11:12:32.771539 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lms9s/crc-debug-6mcmb" Feb 19 11:12:32 crc kubenswrapper[4873]: I0219 11:12:32.838322 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b446129e-ed0f-4d69-b8e6-4080c69ec21b-host\") pod \"crc-debug-6mcmb\" (UID: \"b446129e-ed0f-4d69-b8e6-4080c69ec21b\") " pod="openshift-must-gather-lms9s/crc-debug-6mcmb" Feb 19 11:12:32 crc kubenswrapper[4873]: I0219 11:12:32.838792 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbbjg\" (UniqueName: \"kubernetes.io/projected/b446129e-ed0f-4d69-b8e6-4080c69ec21b-kube-api-access-cbbjg\") pod \"crc-debug-6mcmb\" (UID: \"b446129e-ed0f-4d69-b8e6-4080c69ec21b\") " pod="openshift-must-gather-lms9s/crc-debug-6mcmb" Feb 19 11:12:32 crc kubenswrapper[4873]: I0219 11:12:32.940905 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b446129e-ed0f-4d69-b8e6-4080c69ec21b-host\") pod \"crc-debug-6mcmb\" (UID: \"b446129e-ed0f-4d69-b8e6-4080c69ec21b\") " pod="openshift-must-gather-lms9s/crc-debug-6mcmb" Feb 19 11:12:32 crc kubenswrapper[4873]: I0219 11:12:32.941074 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbbjg\" (UniqueName: \"kubernetes.io/projected/b446129e-ed0f-4d69-b8e6-4080c69ec21b-kube-api-access-cbbjg\") pod \"crc-debug-6mcmb\" (UID: \"b446129e-ed0f-4d69-b8e6-4080c69ec21b\") " pod="openshift-must-gather-lms9s/crc-debug-6mcmb" Feb 19 11:12:32 crc kubenswrapper[4873]: I0219 11:12:32.941084 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b446129e-ed0f-4d69-b8e6-4080c69ec21b-host\") pod \"crc-debug-6mcmb\" (UID: \"b446129e-ed0f-4d69-b8e6-4080c69ec21b\") " pod="openshift-must-gather-lms9s/crc-debug-6mcmb" Feb 19 11:12:32 crc 
kubenswrapper[4873]: I0219 11:12:32.962044 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbbjg\" (UniqueName: \"kubernetes.io/projected/b446129e-ed0f-4d69-b8e6-4080c69ec21b-kube-api-access-cbbjg\") pod \"crc-debug-6mcmb\" (UID: \"b446129e-ed0f-4d69-b8e6-4080c69ec21b\") " pod="openshift-must-gather-lms9s/crc-debug-6mcmb" Feb 19 11:12:33 crc kubenswrapper[4873]: I0219 11:12:33.096010 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lms9s/crc-debug-6mcmb" Feb 19 11:12:33 crc kubenswrapper[4873]: W0219 11:12:33.142931 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb446129e_ed0f_4d69_b8e6_4080c69ec21b.slice/crio-e3d5e07c91a322a359e34accd84aed9f05e4c4c951146784683e31d5d331c1e8 WatchSource:0}: Error finding container e3d5e07c91a322a359e34accd84aed9f05e4c4c951146784683e31d5d331c1e8: Status 404 returned error can't find the container with id e3d5e07c91a322a359e34accd84aed9f05e4c4c951146784683e31d5d331c1e8 Feb 19 11:12:33 crc kubenswrapper[4873]: I0219 11:12:33.497058 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab4a09e3-7ea1-4764-b0e6-75c1b549b77a" path="/var/lib/kubelet/pods/ab4a09e3-7ea1-4764-b0e6-75c1b549b77a/volumes" Feb 19 11:12:33 crc kubenswrapper[4873]: I0219 11:12:33.995401 4873 generic.go:334] "Generic (PLEG): container finished" podID="b446129e-ed0f-4d69-b8e6-4080c69ec21b" containerID="40d299159e87f29a6a7e1c52a9e8d2f2733e3bb33ce23f6b1515f7d41477be8f" exitCode=0 Feb 19 11:12:33 crc kubenswrapper[4873]: I0219 11:12:33.995476 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lms9s/crc-debug-6mcmb" event={"ID":"b446129e-ed0f-4d69-b8e6-4080c69ec21b","Type":"ContainerDied","Data":"40d299159e87f29a6a7e1c52a9e8d2f2733e3bb33ce23f6b1515f7d41477be8f"} Feb 19 11:12:33 crc kubenswrapper[4873]: I0219 11:12:33.995723 4873 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lms9s/crc-debug-6mcmb" event={"ID":"b446129e-ed0f-4d69-b8e6-4080c69ec21b","Type":"ContainerStarted","Data":"e3d5e07c91a322a359e34accd84aed9f05e4c4c951146784683e31d5d331c1e8"} Feb 19 11:12:34 crc kubenswrapper[4873]: I0219 11:12:34.037183 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lms9s/crc-debug-6mcmb"] Feb 19 11:12:34 crc kubenswrapper[4873]: I0219 11:12:34.052408 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lms9s/crc-debug-6mcmb"] Feb 19 11:12:35 crc kubenswrapper[4873]: I0219 11:12:35.511357 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lms9s/crc-debug-6mcmb" Feb 19 11:12:35 crc kubenswrapper[4873]: I0219 11:12:35.596504 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b446129e-ed0f-4d69-b8e6-4080c69ec21b-host\") pod \"b446129e-ed0f-4d69-b8e6-4080c69ec21b\" (UID: \"b446129e-ed0f-4d69-b8e6-4080c69ec21b\") " Feb 19 11:12:35 crc kubenswrapper[4873]: I0219 11:12:35.596641 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b446129e-ed0f-4d69-b8e6-4080c69ec21b-host" (OuterVolumeSpecName: "host") pod "b446129e-ed0f-4d69-b8e6-4080c69ec21b" (UID: "b446129e-ed0f-4d69-b8e6-4080c69ec21b"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 11:12:35 crc kubenswrapper[4873]: I0219 11:12:35.596686 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbbjg\" (UniqueName: \"kubernetes.io/projected/b446129e-ed0f-4d69-b8e6-4080c69ec21b-kube-api-access-cbbjg\") pod \"b446129e-ed0f-4d69-b8e6-4080c69ec21b\" (UID: \"b446129e-ed0f-4d69-b8e6-4080c69ec21b\") " Feb 19 11:12:35 crc kubenswrapper[4873]: I0219 11:12:35.599789 4873 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b446129e-ed0f-4d69-b8e6-4080c69ec21b-host\") on node \"crc\" DevicePath \"\"" Feb 19 11:12:35 crc kubenswrapper[4873]: I0219 11:12:35.618325 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b446129e-ed0f-4d69-b8e6-4080c69ec21b-kube-api-access-cbbjg" (OuterVolumeSpecName: "kube-api-access-cbbjg") pod "b446129e-ed0f-4d69-b8e6-4080c69ec21b" (UID: "b446129e-ed0f-4d69-b8e6-4080c69ec21b"). InnerVolumeSpecName "kube-api-access-cbbjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:12:35 crc kubenswrapper[4873]: I0219 11:12:35.701433 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbbjg\" (UniqueName: \"kubernetes.io/projected/b446129e-ed0f-4d69-b8e6-4080c69ec21b-kube-api-access-cbbjg\") on node \"crc\" DevicePath \"\"" Feb 19 11:12:36 crc kubenswrapper[4873]: I0219 11:12:36.015312 4873 scope.go:117] "RemoveContainer" containerID="40d299159e87f29a6a7e1c52a9e8d2f2733e3bb33ce23f6b1515f7d41477be8f" Feb 19 11:12:36 crc kubenswrapper[4873]: I0219 11:12:36.015442 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lms9s/crc-debug-6mcmb" Feb 19 11:12:36 crc kubenswrapper[4873]: I0219 11:12:36.484490 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:12:36 crc kubenswrapper[4873]: E0219 11:12:36.485253 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:12:37 crc kubenswrapper[4873]: I0219 11:12:37.522656 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b446129e-ed0f-4d69-b8e6-4080c69ec21b" path="/var/lib/kubelet/pods/b446129e-ed0f-4d69-b8e6-4080c69ec21b/volumes" Feb 19 11:12:47 crc kubenswrapper[4873]: I0219 11:12:47.485808 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:12:47 crc kubenswrapper[4873]: E0219 11:12:47.486611 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:12:59 crc kubenswrapper[4873]: I0219 11:12:59.484293 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:12:59 crc kubenswrapper[4873]: E0219 11:12:59.485118 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:13:10 crc kubenswrapper[4873]: I0219 11:13:10.353886 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c4d59d6dd-4nh9w_76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3/barbican-api/0.log" Feb 19 11:13:10 crc kubenswrapper[4873]: I0219 11:13:10.484471 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:13:10 crc kubenswrapper[4873]: E0219 11:13:10.484744 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:13:10 crc kubenswrapper[4873]: I0219 11:13:10.561058 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c4d59d6dd-4nh9w_76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3/barbican-api-log/0.log" Feb 19 11:13:10 crc kubenswrapper[4873]: I0219 11:13:10.588349 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-667444df98-tdgw9_9be5e1ee-a214-46ca-a5bf-d1d337848085/barbican-keystone-listener/0.log" Feb 19 11:13:10 crc kubenswrapper[4873]: I0219 11:13:10.737572 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-667444df98-tdgw9_9be5e1ee-a214-46ca-a5bf-d1d337848085/barbican-keystone-listener-log/0.log" Feb 19 
11:13:10 crc kubenswrapper[4873]: I0219 11:13:10.826641 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-596d5556df-fx4q8_fc48b70c-5ab9-4765-a8cd-5985a3f63854/barbican-worker/0.log" Feb 19 11:13:10 crc kubenswrapper[4873]: I0219 11:13:10.856130 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-596d5556df-fx4q8_fc48b70c-5ab9-4765-a8cd-5985a3f63854/barbican-worker-log/0.log" Feb 19 11:13:11 crc kubenswrapper[4873]: I0219 11:13:11.094539 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r_fb8aa6eb-a92d-47ab-803f-664399242dde/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:11 crc kubenswrapper[4873]: I0219 11:13:11.407540 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e432fa6f-daf1-4f3a-9f84-ac9495956013/ceilometer-notification-agent/0.log" Feb 19 11:13:11 crc kubenswrapper[4873]: I0219 11:13:11.491965 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e432fa6f-daf1-4f3a-9f84-ac9495956013/ceilometer-central-agent/0.log" Feb 19 11:13:11 crc kubenswrapper[4873]: I0219 11:13:11.518389 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e432fa6f-daf1-4f3a-9f84-ac9495956013/proxy-httpd/0.log" Feb 19 11:13:11 crc kubenswrapper[4873]: I0219 11:13:11.613471 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e432fa6f-daf1-4f3a-9f84-ac9495956013/sg-core/0.log" Feb 19 11:13:11 crc kubenswrapper[4873]: I0219 11:13:11.861011 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f3dabe51-c676-42bb-936a-d784ee2e565a/cinder-api-log/0.log" Feb 19 11:13:12 crc kubenswrapper[4873]: I0219 11:13:12.205282 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_312e766d-4086-4bab-bf8f-9a154f1da5b5/probe/0.log" 
Feb 19 11:13:12 crc kubenswrapper[4873]: I0219 11:13:12.283094 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f3dabe51-c676-42bb-936a-d784ee2e565a/cinder-api/0.log" Feb 19 11:13:12 crc kubenswrapper[4873]: I0219 11:13:12.290501 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_312e766d-4086-4bab-bf8f-9a154f1da5b5/cinder-backup/0.log" Feb 19 11:13:12 crc kubenswrapper[4873]: I0219 11:13:12.488938 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1/cinder-scheduler/0.log" Feb 19 11:13:12 crc kubenswrapper[4873]: I0219 11:13:12.574321 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1/probe/0.log" Feb 19 11:13:12 crc kubenswrapper[4873]: I0219 11:13:12.728378 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_717b3122-e7c6-4cbe-8528-4b582dd7adc5/cinder-volume/0.log" Feb 19 11:13:12 crc kubenswrapper[4873]: I0219 11:13:12.803230 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_717b3122-e7c6-4cbe-8528-4b582dd7adc5/probe/0.log" Feb 19 11:13:12 crc kubenswrapper[4873]: I0219 11:13:12.982144 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_8268173a-e7be-4edd-a1e8-bed3486b138e/probe/0.log" Feb 19 11:13:13 crc kubenswrapper[4873]: I0219 11:13:13.158458 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_8268173a-e7be-4edd-a1e8-bed3486b138e/cinder-volume/0.log" Feb 19 11:13:13 crc kubenswrapper[4873]: I0219 11:13:13.261056 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-snp5b_f0739ccd-765a-42c4-89b4-de6adf188e24/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:13 
crc kubenswrapper[4873]: I0219 11:13:13.365299 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2_40ec1f13-0b91-4c7c-a13e-11e60f55e627/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:13 crc kubenswrapper[4873]: I0219 11:13:13.501237 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c564b89cf-9v87f_20253d93-eafe-45db-b11e-338714ffd978/init/0.log" Feb 19 11:13:13 crc kubenswrapper[4873]: I0219 11:13:13.746379 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c564b89cf-9v87f_20253d93-eafe-45db-b11e-338714ffd978/init/0.log" Feb 19 11:13:13 crc kubenswrapper[4873]: I0219 11:13:13.879215 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj_ab7d5a49-ac61-4963-8766-1716098f3d4c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:13 crc kubenswrapper[4873]: I0219 11:13:13.912041 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c564b89cf-9v87f_20253d93-eafe-45db-b11e-338714ffd978/dnsmasq-dns/0.log" Feb 19 11:13:14 crc kubenswrapper[4873]: I0219 11:13:14.094090 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_09cfd898-398f-41ae-8c45-1ed215b69683/glance-httpd/0.log" Feb 19 11:13:14 crc kubenswrapper[4873]: I0219 11:13:14.157323 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_09cfd898-398f-41ae-8c45-1ed215b69683/glance-log/0.log" Feb 19 11:13:14 crc kubenswrapper[4873]: I0219 11:13:14.332113 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c0df7963-e78f-457c-a27f-45c26232cfa7/glance-log/0.log" Feb 19 11:13:14 crc kubenswrapper[4873]: I0219 11:13:14.350643 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_c0df7963-e78f-457c-a27f-45c26232cfa7/glance-httpd/0.log" Feb 19 11:13:14 crc kubenswrapper[4873]: I0219 11:13:14.683565 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6687d9896d-v96j2_fa527f64-6e38-48c2-9927-a319f4579070/horizon/0.log" Feb 19 11:13:14 crc kubenswrapper[4873]: I0219 11:13:14.734810 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6_537c2ac8-0912-4609-ab4e-760060a78d52/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:15 crc kubenswrapper[4873]: I0219 11:13:15.061938 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-s2jwj_4b127e45-b09c-4e11-9423-58f1f51effd4/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:15 crc kubenswrapper[4873]: I0219 11:13:15.350707 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6687d9896d-v96j2_fa527f64-6e38-48c2-9927-a319f4579070/horizon-log/0.log" Feb 19 11:13:15 crc kubenswrapper[4873]: I0219 11:13:15.618347 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5fcd445c48-xvpw4_ed86f09e-909d-451b-96c0-9b4b7b27eb03/keystone-api/0.log" Feb 19 11:13:15 crc kubenswrapper[4873]: I0219 11:13:15.650871 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29524981-pxsmx_3f08f0c4-870d-4d9a-8a82-ce22827ce779/keystone-cron/0.log" Feb 19 11:13:15 crc kubenswrapper[4873]: I0219 11:13:15.696633 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_84c63c73-45f3-4d27-a3a3-cbfecd9e1810/kube-state-metrics/0.log" Feb 19 11:13:15 crc kubenswrapper[4873]: I0219 11:13:15.933957 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf_2baa296e-fb37-4d90-a7e4-68f61006e085/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:16 crc kubenswrapper[4873]: I0219 11:13:16.590435 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb_a607f592-ebca-4bf5-9e98-f9e2bc131ff1/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:16 crc kubenswrapper[4873]: I0219 11:13:16.814883 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76cc4fb9fc-vdfd4_f168d086-aaa7-4a6e-9a65-5ab28e10a7e8/neutron-httpd/0.log" Feb 19 11:13:16 crc kubenswrapper[4873]: I0219 11:13:16.940343 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76cc4fb9fc-vdfd4_f168d086-aaa7-4a6e-9a65-5ab28e10a7e8/neutron-api/0.log" Feb 19 11:13:17 crc kubenswrapper[4873]: I0219 11:13:17.059810 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_da89f0ff-c51c-4c4a-8df4-f7787d29ddd2/setup-container/0.log" Feb 19 11:13:17 crc kubenswrapper[4873]: I0219 11:13:17.133552 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_da89f0ff-c51c-4c4a-8df4-f7787d29ddd2/setup-container/0.log" Feb 19 11:13:17 crc kubenswrapper[4873]: I0219 11:13:17.297631 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_da89f0ff-c51c-4c4a-8df4-f7787d29ddd2/rabbitmq/0.log" Feb 19 11:13:17 crc kubenswrapper[4873]: I0219 11:13:17.902163 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c25b9f1f-0533-4e00-a926-08639b1b2266/nova-cell0-conductor-conductor/0.log" Feb 19 11:13:18 crc kubenswrapper[4873]: I0219 11:13:18.523218 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_0688136a-f0b5-4a2a-8f08-9c99d9c3644c/nova-cell1-conductor-conductor/0.log" Feb 19 11:13:18 crc kubenswrapper[4873]: I0219 11:13:18.937379 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_cf46452a-f49d-48ab-a235-9e96f89c931f/nova-cell1-novncproxy-novncproxy/0.log" Feb 19 11:13:19 crc kubenswrapper[4873]: I0219 11:13:19.036953 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4f4e613e-0a31-4191-9afb-4fd0300586f9/nova-api-log/0.log" Feb 19 11:13:19 crc kubenswrapper[4873]: I0219 11:13:19.248666 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-v25t6_ce5f426d-554a-469a-be1e-e3e1b9bfa68e/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:19 crc kubenswrapper[4873]: I0219 11:13:19.381518 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_15cbab3c-9843-4bf6-b0e8-b65dec1e5112/nova-metadata-log/0.log" Feb 19 11:13:19 crc kubenswrapper[4873]: I0219 11:13:19.486545 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4f4e613e-0a31-4191-9afb-4fd0300586f9/nova-api-api/0.log" Feb 19 11:13:20 crc kubenswrapper[4873]: I0219 11:13:20.086219 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e3385c22-baa0-4261-b498-6a09c8768520/mysql-bootstrap/0.log" Feb 19 11:13:20 crc kubenswrapper[4873]: I0219 11:13:20.231221 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_21bb5d7d-6565-484a-af2d-0edcff2729b3/memcached/0.log" Feb 19 11:13:20 crc kubenswrapper[4873]: I0219 11:13:20.237804 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_adb0395e-00f8-4bc6-a0a6-2b956235c58c/nova-scheduler-scheduler/0.log" Feb 19 11:13:20 crc kubenswrapper[4873]: I0219 11:13:20.306932 4873 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e3385c22-baa0-4261-b498-6a09c8768520/galera/0.log" Feb 19 11:13:20 crc kubenswrapper[4873]: I0219 11:13:20.369594 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e3385c22-baa0-4261-b498-6a09c8768520/mysql-bootstrap/0.log" Feb 19 11:13:20 crc kubenswrapper[4873]: I0219 11:13:20.527751 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964/mysql-bootstrap/0.log" Feb 19 11:13:20 crc kubenswrapper[4873]: I0219 11:13:20.846286 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964/mysql-bootstrap/0.log" Feb 19 11:13:20 crc kubenswrapper[4873]: I0219 11:13:20.868146 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964/galera/0.log" Feb 19 11:13:20 crc kubenswrapper[4873]: I0219 11:13:20.888674 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5c4eb2b5-d272-49ff-938e-3e3359d29f46/openstackclient/0.log" Feb 19 11:13:21 crc kubenswrapper[4873]: I0219 11:13:21.082948 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-djxfb_888c3336-cd8a-4bf2-805f-6b473fb272f4/openstack-network-exporter/0.log" Feb 19 11:13:21 crc kubenswrapper[4873]: I0219 11:13:21.198684 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t5bgp_de2f2331-fc83-420b-9e1b-fe08998cb0ab/ovsdb-server-init/0.log" Feb 19 11:13:21 crc kubenswrapper[4873]: I0219 11:13:21.351979 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_15cbab3c-9843-4bf6-b0e8-b65dec1e5112/nova-metadata-metadata/0.log" Feb 19 11:13:21 crc kubenswrapper[4873]: I0219 11:13:21.427615 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-t5bgp_de2f2331-fc83-420b-9e1b-fe08998cb0ab/ovsdb-server-init/0.log" Feb 19 11:13:21 crc kubenswrapper[4873]: I0219 11:13:21.451415 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t5bgp_de2f2331-fc83-420b-9e1b-fe08998cb0ab/ovsdb-server/0.log" Feb 19 11:13:21 crc kubenswrapper[4873]: I0219 11:13:21.644253 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t5bgp_de2f2331-fc83-420b-9e1b-fe08998cb0ab/ovs-vswitchd/0.log" Feb 19 11:13:21 crc kubenswrapper[4873]: I0219 11:13:21.649481 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vsnt5_b0ab9d21-0c11-4940-ad43-3e20c46012ad/ovn-controller/0.log" Feb 19 11:13:21 crc kubenswrapper[4873]: I0219 11:13:21.757803 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-dks5c_f5d576b5-56dd-4f9f-b67b-0ee87213ea78/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:21 crc kubenswrapper[4873]: I0219 11:13:21.863636 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bd6df8e5-8bc5-4bd5-b466-a90642932cc2/openstack-network-exporter/0.log" Feb 19 11:13:21 crc kubenswrapper[4873]: I0219 11:13:21.871826 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bd6df8e5-8bc5-4bd5-b466-a90642932cc2/ovn-northd/0.log" Feb 19 11:13:21 crc kubenswrapper[4873]: I0219 11:13:21.994496 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4574f6e3-d697-424c-a9f1-7b74afb82324/openstack-network-exporter/0.log" Feb 19 11:13:22 crc kubenswrapper[4873]: I0219 11:13:22.082375 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4574f6e3-d697-424c-a9f1-7b74afb82324/ovsdbserver-nb/0.log" Feb 19 11:13:22 crc kubenswrapper[4873]: I0219 11:13:22.090538 4873 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_877efa5f-4357-4396-8805-729237cd4e8f/openstack-network-exporter/0.log" Feb 19 11:13:22 crc kubenswrapper[4873]: I0219 11:13:22.217738 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_877efa5f-4357-4396-8805-729237cd4e8f/ovsdbserver-sb/0.log" Feb 19 11:13:22 crc kubenswrapper[4873]: I0219 11:13:22.475984 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6696d67b98-wrvnm_c5d4dde9-793b-403e-8701-84cca6a509e1/placement-api/0.log" Feb 19 11:13:22 crc kubenswrapper[4873]: I0219 11:13:22.516399 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ae630a8f-ee42-4f96-adb9-d18bf713af37/init-config-reloader/0.log" Feb 19 11:13:22 crc kubenswrapper[4873]: I0219 11:13:22.539711 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6696d67b98-wrvnm_c5d4dde9-793b-403e-8701-84cca6a509e1/placement-log/0.log" Feb 19 11:13:22 crc kubenswrapper[4873]: I0219 11:13:22.763565 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ae630a8f-ee42-4f96-adb9-d18bf713af37/init-config-reloader/0.log" Feb 19 11:13:22 crc kubenswrapper[4873]: I0219 11:13:22.785753 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ae630a8f-ee42-4f96-adb9-d18bf713af37/config-reloader/0.log" Feb 19 11:13:22 crc kubenswrapper[4873]: I0219 11:13:22.810381 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ae630a8f-ee42-4f96-adb9-d18bf713af37/thanos-sidecar/0.log" Feb 19 11:13:22 crc kubenswrapper[4873]: I0219 11:13:22.812629 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ae630a8f-ee42-4f96-adb9-d18bf713af37/prometheus/0.log" Feb 19 11:13:22 crc kubenswrapper[4873]: I0219 11:13:22.966479 4873 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1150426f-909f-4b05-b216-ccf29f7039eb/setup-container/0.log" Feb 19 11:13:23 crc kubenswrapper[4873]: I0219 11:13:23.140410 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1150426f-909f-4b05-b216-ccf29f7039eb/setup-container/0.log" Feb 19 11:13:23 crc kubenswrapper[4873]: I0219 11:13:23.208259 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d564a6d4-4702-4e96-b814-8d9f01db02e5/setup-container/0.log" Feb 19 11:13:23 crc kubenswrapper[4873]: I0219 11:13:23.227085 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1150426f-909f-4b05-b216-ccf29f7039eb/rabbitmq/0.log" Feb 19 11:13:23 crc kubenswrapper[4873]: I0219 11:13:23.414350 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d564a6d4-4702-4e96-b814-8d9f01db02e5/setup-container/0.log" Feb 19 11:13:23 crc kubenswrapper[4873]: I0219 11:13:23.484705 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj_157ee933-b692-4c92-bcbd-967bc1cd377c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:23 crc kubenswrapper[4873]: I0219 11:13:23.528638 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d564a6d4-4702-4e96-b814-8d9f01db02e5/rabbitmq/0.log" Feb 19 11:13:23 crc kubenswrapper[4873]: I0219 11:13:23.908227 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn_fda37ba3-82f5-4d49-a15f-4dca53649ec7/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:23 crc kubenswrapper[4873]: I0219 11:13:23.920080 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-mt2n6_3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:24 crc kubenswrapper[4873]: I0219 11:13:24.060201 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-5wvjf_7843f72c-5559-44d6-86e0-62f013e0a073/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:24 crc kubenswrapper[4873]: I0219 11:13:24.130410 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-sw8hj_15999617-f2b4-4a3f-911d-422db799fa37/ssh-known-hosts-edpm-deployment/0.log" Feb 19 11:13:24 crc kubenswrapper[4873]: I0219 11:13:24.343584 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7c6d694569-qbpxm_d51beb70-e455-4e75-9e06-863b41fbf9a8/proxy-server/0.log" Feb 19 11:13:24 crc kubenswrapper[4873]: I0219 11:13:24.416932 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7c6d694569-qbpxm_d51beb70-e455-4e75-9e06-863b41fbf9a8/proxy-httpd/0.log" Feb 19 11:13:24 crc kubenswrapper[4873]: I0219 11:13:24.485866 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:13:24 crc kubenswrapper[4873]: E0219 11:13:24.486626 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:13:24 crc kubenswrapper[4873]: I0219 11:13:24.545349 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-ring-rebalance-mx6qq_91fbca18-847d-4e7b-8a40-e52dd348d155/swift-ring-rebalance/0.log" Feb 19 11:13:24 crc kubenswrapper[4873]: I0219 11:13:24.592383 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/account-auditor/0.log" Feb 19 11:13:24 crc kubenswrapper[4873]: I0219 11:13:24.709075 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/account-reaper/0.log" Feb 19 11:13:24 crc kubenswrapper[4873]: I0219 11:13:24.758000 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/account-replicator/0.log" Feb 19 11:13:24 crc kubenswrapper[4873]: I0219 11:13:24.799372 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/account-server/0.log" Feb 19 11:13:24 crc kubenswrapper[4873]: I0219 11:13:24.824496 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/container-auditor/0.log" Feb 19 11:13:24 crc kubenswrapper[4873]: I0219 11:13:24.910203 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/container-replicator/0.log" Feb 19 11:13:24 crc kubenswrapper[4873]: I0219 11:13:24.930754 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/container-server/0.log" Feb 19 11:13:25 crc kubenswrapper[4873]: I0219 11:13:25.030030 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/container-updater/0.log" Feb 19 11:13:25 crc kubenswrapper[4873]: I0219 11:13:25.122775 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/object-expirer/0.log" Feb 19 11:13:25 crc kubenswrapper[4873]: I0219 11:13:25.128243 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/object-auditor/0.log" Feb 19 11:13:25 crc kubenswrapper[4873]: I0219 11:13:25.240074 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/object-server/0.log" Feb 19 11:13:25 crc kubenswrapper[4873]: I0219 11:13:25.252343 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/object-replicator/0.log" Feb 19 11:13:25 crc kubenswrapper[4873]: I0219 11:13:25.273402 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/object-updater/0.log" Feb 19 11:13:25 crc kubenswrapper[4873]: I0219 11:13:25.357028 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/swift-recon-cron/0.log" Feb 19 11:13:25 crc kubenswrapper[4873]: I0219 11:13:25.358193 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/rsync/0.log" Feb 19 11:13:25 crc kubenswrapper[4873]: I0219 11:13:25.530080 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz_bf143721-2963-4009-8e23-0c283b4a88a3/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:25 crc kubenswrapper[4873]: I0219 11:13:25.618926 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5e5a79da-a068-4a68-ba79-6719ea0fb353/tempest-tests-tempest-tests-runner/0.log" Feb 19 11:13:25 crc kubenswrapper[4873]: I0219 11:13:25.752338 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_58738a83-0734-4889-9b0e-650e43f6dbb7/test-operator-logs-container/0.log" Feb 19 11:13:25 crc kubenswrapper[4873]: I0219 11:13:25.819171 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh_28f40398-582f-40ed-92b8-2ff5a19d138d/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:13:26 crc kubenswrapper[4873]: I0219 11:13:26.456858 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_3d0e231c-7848-4f57-a28b-dfec3c87b617/watcher-applier/0.log" Feb 19 11:13:26 crc kubenswrapper[4873]: I0219 11:13:26.996683 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_9fb835f9-7ac4-4212-a372-b793c2fb8afd/watcher-api-log/0.log" Feb 19 11:13:29 crc kubenswrapper[4873]: I0219 11:13:29.785515 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_3ecf8671-28f5-4549-a4c1-0cdad8421837/watcher-decision-engine/0.log" Feb 19 11:13:30 crc kubenswrapper[4873]: I0219 11:13:30.370157 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_9fb835f9-7ac4-4212-a372-b793c2fb8afd/watcher-api/0.log" Feb 19 11:13:38 crc kubenswrapper[4873]: I0219 11:13:38.484898 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:13:38 crc kubenswrapper[4873]: E0219 11:13:38.486705 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 
11:13:52 crc kubenswrapper[4873]: I0219 11:13:52.484133 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:13:52 crc kubenswrapper[4873]: I0219 11:13:52.790797 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"b1849d703253da95bfe5a3436c40938b54212f4d26fc335188390020db850543"} Feb 19 11:13:56 crc kubenswrapper[4873]: I0219 11:13:56.557346 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6_78582e6c-dedc-4608-a542-6837184954ab/util/0.log" Feb 19 11:13:56 crc kubenswrapper[4873]: I0219 11:13:56.720025 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6_78582e6c-dedc-4608-a542-6837184954ab/util/0.log" Feb 19 11:13:56 crc kubenswrapper[4873]: I0219 11:13:56.751830 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6_78582e6c-dedc-4608-a542-6837184954ab/pull/0.log" Feb 19 11:13:56 crc kubenswrapper[4873]: I0219 11:13:56.799458 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6_78582e6c-dedc-4608-a542-6837184954ab/pull/0.log" Feb 19 11:13:56 crc kubenswrapper[4873]: I0219 11:13:56.994591 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6_78582e6c-dedc-4608-a542-6837184954ab/pull/0.log" Feb 19 11:13:57 crc kubenswrapper[4873]: I0219 11:13:57.032841 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6_78582e6c-dedc-4608-a542-6837184954ab/util/0.log" Feb 19 11:13:57 crc kubenswrapper[4873]: I0219 11:13:57.050161 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6_78582e6c-dedc-4608-a542-6837184954ab/extract/0.log" Feb 19 11:13:57 crc kubenswrapper[4873]: I0219 11:13:57.475668 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-t54x9_f108f6ea-4506-48bf-b948-e367078c3dce/manager/0.log" Feb 19 11:13:57 crc kubenswrapper[4873]: I0219 11:13:57.819333 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-vgxsl_43531003-74d3-43b9-b0f5-6fca42b21975/manager/0.log" Feb 19 11:13:58 crc kubenswrapper[4873]: I0219 11:13:58.419711 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-vwx5n_8d4b6c84-e5ed-4761-b7c7-95b21da856f7/manager/0.log" Feb 19 11:13:58 crc kubenswrapper[4873]: I0219 11:13:58.708397 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-r9b5b_2b1c8872-b310-4994-819c-a8e472d8e522/manager/0.log" Feb 19 11:13:59 crc kubenswrapper[4873]: I0219 11:13:59.223016 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-f86jr_aeccf47e-b953-4036-b271-be284b9ab385/manager/0.log" Feb 19 11:13:59 crc kubenswrapper[4873]: I0219 11:13:59.660269 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-4t46s_3ff0155f-08fd-42f5-9b31-c3b9a7cefefe/manager/0.log" Feb 19 11:13:59 crc kubenswrapper[4873]: I0219 11:13:59.752862 4873 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-cx7xf_2e7ca3f2-f73b-4bac-93bb-68b2518d956e/manager/0.log" Feb 19 11:14:00 crc kubenswrapper[4873]: I0219 11:14:00.489012 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-t2hfl_e4172fa9-b04e-4894-82d6-ec65ea92b004/manager/0.log" Feb 19 11:14:00 crc kubenswrapper[4873]: I0219 11:14:00.499267 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-t7mwr_ecf3484a-026e-4655-bfa8-e5292e2f62c5/manager/0.log" Feb 19 11:14:00 crc kubenswrapper[4873]: I0219 11:14:00.735585 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-8v7q6_588098b3-662f-4f6f-914c-8cb28e055ccd/manager/0.log" Feb 19 11:14:00 crc kubenswrapper[4873]: I0219 11:14:00.905480 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-d6h72_c471d099-fa02-4463-9eb9-9d0f6a3832e6/manager/0.log" Feb 19 11:14:01 crc kubenswrapper[4873]: I0219 11:14:01.026678 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-n6djt_8eec8859-f388-4d81-bbce-0433a66a1ef7/manager/0.log" Feb 19 11:14:01 crc kubenswrapper[4873]: I0219 11:14:01.283867 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv_515c6c0c-ae00-4ae1-ab3f-e22e5a585681/manager/0.log" Feb 19 11:14:01 crc kubenswrapper[4873]: I0219 11:14:01.564001 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-8476bb6847-rv4sx_e18b6851-e022-488e-bd95-27d1659f2761/operator/0.log" Feb 19 11:14:01 crc kubenswrapper[4873]: I0219 
11:14:01.746087 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-p62rb_0144fe1c-ef13-4b4e-8cda-ddc72e2516bb/registry-server/0.log" Feb 19 11:14:02 crc kubenswrapper[4873]: I0219 11:14:02.052786 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-db4dr_dc53742c-7e71-49fa-9378-b26036c80275/manager/0.log" Feb 19 11:14:02 crc kubenswrapper[4873]: I0219 11:14:02.280613 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-6hpwv_74e9952e-50ef-4389-aa77-8f6e9cc790a8/manager/0.log" Feb 19 11:14:02 crc kubenswrapper[4873]: I0219 11:14:02.607997 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-lcnz4_9574bff7-0aac-4a24-b69f-135ff968422e/operator/0.log" Feb 19 11:14:02 crc kubenswrapper[4873]: I0219 11:14:02.964524 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-r74rt_1f098ace-bbc4-46ee-8e72-ab65a59851eb/manager/0.log" Feb 19 11:14:03 crc kubenswrapper[4873]: I0219 11:14:03.557665 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-2szzj_e139553a-a68d-424d-95b5-9093ea05440b/manager/0.log" Feb 19 11:14:03 crc kubenswrapper[4873]: I0219 11:14:03.608857 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-g22tc_0e9da99c-56ee-4353-9378-c59a2c4e1608/manager/0.log" Feb 19 11:14:03 crc kubenswrapper[4873]: I0219 11:14:03.969043 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7d767c64df-hld6w_e827e28d-ffd8-4f59-82bf-a6db1dab5413/manager/0.log" Feb 19 11:14:04 crc kubenswrapper[4873]: I0219 
11:14:04.088590 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-77c7c45f98-q8khx_26f0a6ea-18fb-411a-b193-83938a4bbe19/manager/0.log" Feb 19 11:14:04 crc kubenswrapper[4873]: I0219 11:14:04.248123 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-t9kgf_080befba-c501-4f84-8644-6b9fda0d8d5f/manager/0.log" Feb 19 11:14:09 crc kubenswrapper[4873]: I0219 11:14:09.876984 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-hqmvw_d53d2bae-fcdd-408c-9950-440e841cc035/manager/0.log" Feb 19 11:14:12 crc kubenswrapper[4873]: I0219 11:14:12.899651 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9rg97"] Feb 19 11:14:12 crc kubenswrapper[4873]: E0219 11:14:12.900786 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b446129e-ed0f-4d69-b8e6-4080c69ec21b" containerName="container-00" Feb 19 11:14:12 crc kubenswrapper[4873]: I0219 11:14:12.900803 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="b446129e-ed0f-4d69-b8e6-4080c69ec21b" containerName="container-00" Feb 19 11:14:12 crc kubenswrapper[4873]: I0219 11:14:12.901043 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="b446129e-ed0f-4d69-b8e6-4080c69ec21b" containerName="container-00" Feb 19 11:14:12 crc kubenswrapper[4873]: I0219 11:14:12.909544 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:12 crc kubenswrapper[4873]: I0219 11:14:12.915361 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rg97"] Feb 19 11:14:13 crc kubenswrapper[4873]: I0219 11:14:13.061496 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzbjg\" (UniqueName: \"kubernetes.io/projected/78dc4897-272b-47c0-a914-f67474646b69-kube-api-access-qzbjg\") pod \"redhat-marketplace-9rg97\" (UID: \"78dc4897-272b-47c0-a914-f67474646b69\") " pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:13 crc kubenswrapper[4873]: I0219 11:14:13.061689 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78dc4897-272b-47c0-a914-f67474646b69-catalog-content\") pod \"redhat-marketplace-9rg97\" (UID: \"78dc4897-272b-47c0-a914-f67474646b69\") " pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:13 crc kubenswrapper[4873]: I0219 11:14:13.061725 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78dc4897-272b-47c0-a914-f67474646b69-utilities\") pod \"redhat-marketplace-9rg97\" (UID: \"78dc4897-272b-47c0-a914-f67474646b69\") " pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:13 crc kubenswrapper[4873]: I0219 11:14:13.163927 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78dc4897-272b-47c0-a914-f67474646b69-catalog-content\") pod \"redhat-marketplace-9rg97\" (UID: \"78dc4897-272b-47c0-a914-f67474646b69\") " pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:13 crc kubenswrapper[4873]: I0219 11:14:13.164251 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78dc4897-272b-47c0-a914-f67474646b69-utilities\") pod \"redhat-marketplace-9rg97\" (UID: \"78dc4897-272b-47c0-a914-f67474646b69\") " pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:13 crc kubenswrapper[4873]: I0219 11:14:13.164517 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78dc4897-272b-47c0-a914-f67474646b69-catalog-content\") pod \"redhat-marketplace-9rg97\" (UID: \"78dc4897-272b-47c0-a914-f67474646b69\") " pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:13 crc kubenswrapper[4873]: I0219 11:14:13.164647 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78dc4897-272b-47c0-a914-f67474646b69-utilities\") pod \"redhat-marketplace-9rg97\" (UID: \"78dc4897-272b-47c0-a914-f67474646b69\") " pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:13 crc kubenswrapper[4873]: I0219 11:14:13.164839 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzbjg\" (UniqueName: \"kubernetes.io/projected/78dc4897-272b-47c0-a914-f67474646b69-kube-api-access-qzbjg\") pod \"redhat-marketplace-9rg97\" (UID: \"78dc4897-272b-47c0-a914-f67474646b69\") " pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:13 crc kubenswrapper[4873]: I0219 11:14:13.186837 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzbjg\" (UniqueName: \"kubernetes.io/projected/78dc4897-272b-47c0-a914-f67474646b69-kube-api-access-qzbjg\") pod \"redhat-marketplace-9rg97\" (UID: \"78dc4897-272b-47c0-a914-f67474646b69\") " pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:13 crc kubenswrapper[4873]: I0219 11:14:13.235760 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:13 crc kubenswrapper[4873]: W0219 11:14:13.927842 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78dc4897_272b_47c0_a914_f67474646b69.slice/crio-bba085bf06b9aa2ef5767b63836e8944be40006b10c41e1a9e0f78dc8336e33d WatchSource:0}: Error finding container bba085bf06b9aa2ef5767b63836e8944be40006b10c41e1a9e0f78dc8336e33d: Status 404 returned error can't find the container with id bba085bf06b9aa2ef5767b63836e8944be40006b10c41e1a9e0f78dc8336e33d Feb 19 11:14:13 crc kubenswrapper[4873]: I0219 11:14:13.941843 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rg97"] Feb 19 11:14:14 crc kubenswrapper[4873]: I0219 11:14:14.021888 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rg97" event={"ID":"78dc4897-272b-47c0-a914-f67474646b69","Type":"ContainerStarted","Data":"bba085bf06b9aa2ef5767b63836e8944be40006b10c41e1a9e0f78dc8336e33d"} Feb 19 11:14:15 crc kubenswrapper[4873]: I0219 11:14:15.033722 4873 generic.go:334] "Generic (PLEG): container finished" podID="78dc4897-272b-47c0-a914-f67474646b69" containerID="b30980f32b2645748e6c10b7a5e9191a7ed287e91ec99d6202bbad12353be541" exitCode=0 Feb 19 11:14:15 crc kubenswrapper[4873]: I0219 11:14:15.034026 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rg97" event={"ID":"78dc4897-272b-47c0-a914-f67474646b69","Type":"ContainerDied","Data":"b30980f32b2645748e6c10b7a5e9191a7ed287e91ec99d6202bbad12353be541"} Feb 19 11:14:16 crc kubenswrapper[4873]: I0219 11:14:16.042997 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rg97" 
event={"ID":"78dc4897-272b-47c0-a914-f67474646b69","Type":"ContainerStarted","Data":"981c3b34f5f254eaaf4de449435114b331f5b9ed7452571e8eef21054d499734"} Feb 19 11:14:17 crc kubenswrapper[4873]: I0219 11:14:17.055309 4873 generic.go:334] "Generic (PLEG): container finished" podID="78dc4897-272b-47c0-a914-f67474646b69" containerID="981c3b34f5f254eaaf4de449435114b331f5b9ed7452571e8eef21054d499734" exitCode=0 Feb 19 11:14:17 crc kubenswrapper[4873]: I0219 11:14:17.055398 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rg97" event={"ID":"78dc4897-272b-47c0-a914-f67474646b69","Type":"ContainerDied","Data":"981c3b34f5f254eaaf4de449435114b331f5b9ed7452571e8eef21054d499734"} Feb 19 11:14:18 crc kubenswrapper[4873]: I0219 11:14:18.066901 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rg97" event={"ID":"78dc4897-272b-47c0-a914-f67474646b69","Type":"ContainerStarted","Data":"97f52c7b6a3d35760dce1e3eea42c096bc9ea72133b589b617ac83afe8756696"} Feb 19 11:14:18 crc kubenswrapper[4873]: I0219 11:14:18.110134 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9rg97" podStartSLOduration=3.511158011 podStartE2EDuration="6.110114632s" podCreationTimestamp="2026-02-19 11:14:12 +0000 UTC" firstStartedPulling="2026-02-19 11:14:15.036551567 +0000 UTC m=+5364.325983205" lastFinishedPulling="2026-02-19 11:14:17.635508188 +0000 UTC m=+5366.924939826" observedRunningTime="2026-02-19 11:14:18.095222038 +0000 UTC m=+5367.384653686" watchObservedRunningTime="2026-02-19 11:14:18.110114632 +0000 UTC m=+5367.399546270" Feb 19 11:14:23 crc kubenswrapper[4873]: I0219 11:14:23.236589 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:23 crc kubenswrapper[4873]: I0219 11:14:23.237317 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:23 crc kubenswrapper[4873]: I0219 11:14:23.285407 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:24 crc kubenswrapper[4873]: I0219 11:14:24.174692 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:24 crc kubenswrapper[4873]: I0219 11:14:24.233715 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rg97"] Feb 19 11:14:26 crc kubenswrapper[4873]: I0219 11:14:26.140302 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9rg97" podUID="78dc4897-272b-47c0-a914-f67474646b69" containerName="registry-server" containerID="cri-o://97f52c7b6a3d35760dce1e3eea42c096bc9ea72133b589b617ac83afe8756696" gracePeriod=2 Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.150772 4873 generic.go:334] "Generic (PLEG): container finished" podID="78dc4897-272b-47c0-a914-f67474646b69" containerID="97f52c7b6a3d35760dce1e3eea42c096bc9ea72133b589b617ac83afe8756696" exitCode=0 Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.150817 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rg97" event={"ID":"78dc4897-272b-47c0-a914-f67474646b69","Type":"ContainerDied","Data":"97f52c7b6a3d35760dce1e3eea42c096bc9ea72133b589b617ac83afe8756696"} Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.151377 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rg97" event={"ID":"78dc4897-272b-47c0-a914-f67474646b69","Type":"ContainerDied","Data":"bba085bf06b9aa2ef5767b63836e8944be40006b10c41e1a9e0f78dc8336e33d"} Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.151394 4873 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="bba085bf06b9aa2ef5767b63836e8944be40006b10c41e1a9e0f78dc8336e33d" Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.237080 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.384072 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78dc4897-272b-47c0-a914-f67474646b69-utilities\") pod \"78dc4897-272b-47c0-a914-f67474646b69\" (UID: \"78dc4897-272b-47c0-a914-f67474646b69\") " Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.384145 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzbjg\" (UniqueName: \"kubernetes.io/projected/78dc4897-272b-47c0-a914-f67474646b69-kube-api-access-qzbjg\") pod \"78dc4897-272b-47c0-a914-f67474646b69\" (UID: \"78dc4897-272b-47c0-a914-f67474646b69\") " Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.384257 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78dc4897-272b-47c0-a914-f67474646b69-catalog-content\") pod \"78dc4897-272b-47c0-a914-f67474646b69\" (UID: \"78dc4897-272b-47c0-a914-f67474646b69\") " Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.385145 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78dc4897-272b-47c0-a914-f67474646b69-utilities" (OuterVolumeSpecName: "utilities") pod "78dc4897-272b-47c0-a914-f67474646b69" (UID: "78dc4897-272b-47c0-a914-f67474646b69"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.385806 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78dc4897-272b-47c0-a914-f67474646b69-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.398615 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78dc4897-272b-47c0-a914-f67474646b69-kube-api-access-qzbjg" (OuterVolumeSpecName: "kube-api-access-qzbjg") pod "78dc4897-272b-47c0-a914-f67474646b69" (UID: "78dc4897-272b-47c0-a914-f67474646b69"). InnerVolumeSpecName "kube-api-access-qzbjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.400671 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-s67xb_d639ff25-343e-4e7c-bd2e-f5fc533923f4/control-plane-machine-set-operator/0.log" Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.405835 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78dc4897-272b-47c0-a914-f67474646b69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78dc4897-272b-47c0-a914-f67474646b69" (UID: "78dc4897-272b-47c0-a914-f67474646b69"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.454703 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-k627b_df659e7d-39ab-41ee-8df5-08896976666c/kube-rbac-proxy/0.log" Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.487931 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzbjg\" (UniqueName: \"kubernetes.io/projected/78dc4897-272b-47c0-a914-f67474646b69-kube-api-access-qzbjg\") on node \"crc\" DevicePath \"\"" Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.487963 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78dc4897-272b-47c0-a914-f67474646b69-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 11:14:27 crc kubenswrapper[4873]: I0219 11:14:27.581317 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-k627b_df659e7d-39ab-41ee-8df5-08896976666c/machine-api-operator/0.log" Feb 19 11:14:28 crc kubenswrapper[4873]: I0219 11:14:28.167054 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rg97" Feb 19 11:14:28 crc kubenswrapper[4873]: I0219 11:14:28.195602 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rg97"] Feb 19 11:14:28 crc kubenswrapper[4873]: I0219 11:14:28.207852 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rg97"] Feb 19 11:14:29 crc kubenswrapper[4873]: I0219 11:14:29.500017 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78dc4897-272b-47c0-a914-f67474646b69" path="/var/lib/kubelet/pods/78dc4897-272b-47c0-a914-f67474646b69/volumes" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.341463 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9vrv5"] Feb 19 11:14:33 crc kubenswrapper[4873]: E0219 11:14:33.342255 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78dc4897-272b-47c0-a914-f67474646b69" containerName="registry-server" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.342266 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="78dc4897-272b-47c0-a914-f67474646b69" containerName="registry-server" Feb 19 11:14:33 crc kubenswrapper[4873]: E0219 11:14:33.342286 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78dc4897-272b-47c0-a914-f67474646b69" containerName="extract-utilities" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.342292 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="78dc4897-272b-47c0-a914-f67474646b69" containerName="extract-utilities" Feb 19 11:14:33 crc kubenswrapper[4873]: E0219 11:14:33.342313 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78dc4897-272b-47c0-a914-f67474646b69" containerName="extract-content" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.342346 4873 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="78dc4897-272b-47c0-a914-f67474646b69" containerName="extract-content" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.342548 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="78dc4897-272b-47c0-a914-f67474646b69" containerName="registry-server" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.344217 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.404534 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9vrv5"] Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.413966 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aa970ec-fd76-4bab-a561-14756fefbdd1-catalog-content\") pod \"redhat-operators-9vrv5\" (UID: \"3aa970ec-fd76-4bab-a561-14756fefbdd1\") " pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.414144 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqbp6\" (UniqueName: \"kubernetes.io/projected/3aa970ec-fd76-4bab-a561-14756fefbdd1-kube-api-access-kqbp6\") pod \"redhat-operators-9vrv5\" (UID: \"3aa970ec-fd76-4bab-a561-14756fefbdd1\") " pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.414278 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aa970ec-fd76-4bab-a561-14756fefbdd1-utilities\") pod \"redhat-operators-9vrv5\" (UID: \"3aa970ec-fd76-4bab-a561-14756fefbdd1\") " pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.516495 4873 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aa970ec-fd76-4bab-a561-14756fefbdd1-catalog-content\") pod \"redhat-operators-9vrv5\" (UID: \"3aa970ec-fd76-4bab-a561-14756fefbdd1\") " pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.516619 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqbp6\" (UniqueName: \"kubernetes.io/projected/3aa970ec-fd76-4bab-a561-14756fefbdd1-kube-api-access-kqbp6\") pod \"redhat-operators-9vrv5\" (UID: \"3aa970ec-fd76-4bab-a561-14756fefbdd1\") " pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.516671 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aa970ec-fd76-4bab-a561-14756fefbdd1-utilities\") pod \"redhat-operators-9vrv5\" (UID: \"3aa970ec-fd76-4bab-a561-14756fefbdd1\") " pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.517126 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aa970ec-fd76-4bab-a561-14756fefbdd1-catalog-content\") pod \"redhat-operators-9vrv5\" (UID: \"3aa970ec-fd76-4bab-a561-14756fefbdd1\") " pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.517231 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aa970ec-fd76-4bab-a561-14756fefbdd1-utilities\") pod \"redhat-operators-9vrv5\" (UID: \"3aa970ec-fd76-4bab-a561-14756fefbdd1\") " pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.543164 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqbp6\" (UniqueName: 
\"kubernetes.io/projected/3aa970ec-fd76-4bab-a561-14756fefbdd1-kube-api-access-kqbp6\") pod \"redhat-operators-9vrv5\" (UID: \"3aa970ec-fd76-4bab-a561-14756fefbdd1\") " pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:14:33 crc kubenswrapper[4873]: I0219 11:14:33.699913 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:14:34 crc kubenswrapper[4873]: I0219 11:14:34.207585 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9vrv5"] Feb 19 11:14:35 crc kubenswrapper[4873]: I0219 11:14:35.226344 4873 generic.go:334] "Generic (PLEG): container finished" podID="3aa970ec-fd76-4bab-a561-14756fefbdd1" containerID="d87b5266055a760854f04cd5355a22ae3b05d93504a1ddb696329e751f4db248" exitCode=0 Feb 19 11:14:35 crc kubenswrapper[4873]: I0219 11:14:35.226403 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vrv5" event={"ID":"3aa970ec-fd76-4bab-a561-14756fefbdd1","Type":"ContainerDied","Data":"d87b5266055a760854f04cd5355a22ae3b05d93504a1ddb696329e751f4db248"} Feb 19 11:14:35 crc kubenswrapper[4873]: I0219 11:14:35.226448 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vrv5" event={"ID":"3aa970ec-fd76-4bab-a561-14756fefbdd1","Type":"ContainerStarted","Data":"dea05a1dfe955bca77ebea3dc2661c6c66f8b364f16daacd38376cbfa5a9555b"} Feb 19 11:14:36 crc kubenswrapper[4873]: I0219 11:14:36.237834 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vrv5" event={"ID":"3aa970ec-fd76-4bab-a561-14756fefbdd1","Type":"ContainerStarted","Data":"119594e8ef0e26304e1384501bf3399b006a307679a417dd104d0e34ceace31b"} Feb 19 11:14:41 crc kubenswrapper[4873]: I0219 11:14:41.303031 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-858654f9db-ckd42_51fc361b-11a5-480a-a5b9-0eb4b7670e83/cert-manager-controller/0.log" Feb 19 11:14:41 crc kubenswrapper[4873]: I0219 11:14:41.360766 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-zhqgv_084c90b4-3270-4f64-8c8c-1a96f05dc1fa/cert-manager-cainjector/0.log" Feb 19 11:14:41 crc kubenswrapper[4873]: I0219 11:14:41.447793 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-fhd9c_2eebe311-368b-45b4-9e74-7442221e3785/cert-manager-webhook/0.log" Feb 19 11:14:42 crc kubenswrapper[4873]: I0219 11:14:42.324871 4873 generic.go:334] "Generic (PLEG): container finished" podID="3aa970ec-fd76-4bab-a561-14756fefbdd1" containerID="119594e8ef0e26304e1384501bf3399b006a307679a417dd104d0e34ceace31b" exitCode=0 Feb 19 11:14:42 crc kubenswrapper[4873]: I0219 11:14:42.324990 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vrv5" event={"ID":"3aa970ec-fd76-4bab-a561-14756fefbdd1","Type":"ContainerDied","Data":"119594e8ef0e26304e1384501bf3399b006a307679a417dd104d0e34ceace31b"} Feb 19 11:14:43 crc kubenswrapper[4873]: I0219 11:14:43.335635 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vrv5" event={"ID":"3aa970ec-fd76-4bab-a561-14756fefbdd1","Type":"ContainerStarted","Data":"3876019e05c868470eef8302abe3da2e5159d53aa9c32b37cb60972c4e6520b9"} Feb 19 11:14:43 crc kubenswrapper[4873]: I0219 11:14:43.700921 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:14:43 crc kubenswrapper[4873]: I0219 11:14:43.700988 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:14:44 crc kubenswrapper[4873]: I0219 11:14:44.782587 4873 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-9vrv5" podUID="3aa970ec-fd76-4bab-a561-14756fefbdd1" containerName="registry-server" probeResult="failure" output=< Feb 19 11:14:44 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s Feb 19 11:14:44 crc kubenswrapper[4873]: > Feb 19 11:14:54 crc kubenswrapper[4873]: I0219 11:14:54.749147 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9vrv5" podUID="3aa970ec-fd76-4bab-a561-14756fefbdd1" containerName="registry-server" probeResult="failure" output=< Feb 19 11:14:54 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s Feb 19 11:14:54 crc kubenswrapper[4873]: > Feb 19 11:14:54 crc kubenswrapper[4873]: I0219 11:14:54.778193 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-9cr2m_9b3c6348-1c17-4774-9739-7a1dd3021d81/nmstate-console-plugin/0.log" Feb 19 11:14:54 crc kubenswrapper[4873]: I0219 11:14:54.944559 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-75txf_62408ce4-73ce-4726-91c1-96f645c39dee/nmstate-handler/0.log" Feb 19 11:14:55 crc kubenswrapper[4873]: I0219 11:14:55.007477 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-8jgss_3b960434-ef37-45ae-aa50-8d719c8e2df5/kube-rbac-proxy/0.log" Feb 19 11:14:55 crc kubenswrapper[4873]: I0219 11:14:55.094330 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-8jgss_3b960434-ef37-45ae-aa50-8d719c8e2df5/nmstate-metrics/0.log" Feb 19 11:14:55 crc kubenswrapper[4873]: I0219 11:14:55.154454 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-qlgxw_f7f28c8a-4571-485c-96a2-fc1c5856e3ea/nmstate-operator/0.log" Feb 19 11:14:55 crc kubenswrapper[4873]: I0219 11:14:55.294711 4873 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-nfh8w_7af074a2-c1f7-4253-8efc-065748e0452b/nmstate-webhook/0.log" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.156162 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9vrv5" podStartSLOduration=19.626669921 podStartE2EDuration="27.156139717s" podCreationTimestamp="2026-02-19 11:14:33 +0000 UTC" firstStartedPulling="2026-02-19 11:14:35.228698426 +0000 UTC m=+5384.518130064" lastFinishedPulling="2026-02-19 11:14:42.758168222 +0000 UTC m=+5392.047599860" observedRunningTime="2026-02-19 11:14:43.363361582 +0000 UTC m=+5392.652793220" watchObservedRunningTime="2026-02-19 11:15:00.156139717 +0000 UTC m=+5409.445571355" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.164001 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm"] Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.165604 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.167874 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.169602 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.179875 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm"] Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.233382 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b012f5e4-512b-4887-87c4-3b1d54b23599-secret-volume\") pod \"collect-profiles-29524995-hsrvm\" (UID: \"b012f5e4-512b-4887-87c4-3b1d54b23599\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.233555 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s7v7\" (UniqueName: \"kubernetes.io/projected/b012f5e4-512b-4887-87c4-3b1d54b23599-kube-api-access-5s7v7\") pod \"collect-profiles-29524995-hsrvm\" (UID: \"b012f5e4-512b-4887-87c4-3b1d54b23599\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.233673 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b012f5e4-512b-4887-87c4-3b1d54b23599-config-volume\") pod \"collect-profiles-29524995-hsrvm\" (UID: \"b012f5e4-512b-4887-87c4-3b1d54b23599\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.335691 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s7v7\" (UniqueName: \"kubernetes.io/projected/b012f5e4-512b-4887-87c4-3b1d54b23599-kube-api-access-5s7v7\") pod \"collect-profiles-29524995-hsrvm\" (UID: \"b012f5e4-512b-4887-87c4-3b1d54b23599\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.336170 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b012f5e4-512b-4887-87c4-3b1d54b23599-config-volume\") pod \"collect-profiles-29524995-hsrvm\" (UID: \"b012f5e4-512b-4887-87c4-3b1d54b23599\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.336379 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b012f5e4-512b-4887-87c4-3b1d54b23599-secret-volume\") pod \"collect-profiles-29524995-hsrvm\" (UID: \"b012f5e4-512b-4887-87c4-3b1d54b23599\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.337543 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b012f5e4-512b-4887-87c4-3b1d54b23599-config-volume\") pod \"collect-profiles-29524995-hsrvm\" (UID: \"b012f5e4-512b-4887-87c4-3b1d54b23599\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.344780 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b012f5e4-512b-4887-87c4-3b1d54b23599-secret-volume\") pod \"collect-profiles-29524995-hsrvm\" (UID: \"b012f5e4-512b-4887-87c4-3b1d54b23599\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.357413 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s7v7\" (UniqueName: \"kubernetes.io/projected/b012f5e4-512b-4887-87c4-3b1d54b23599-kube-api-access-5s7v7\") pod \"collect-profiles-29524995-hsrvm\" (UID: \"b012f5e4-512b-4887-87c4-3b1d54b23599\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.493671 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" Feb 19 11:15:00 crc kubenswrapper[4873]: I0219 11:15:00.983243 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm"] Feb 19 11:15:01 crc kubenswrapper[4873]: I0219 11:15:01.503446 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" event={"ID":"b012f5e4-512b-4887-87c4-3b1d54b23599","Type":"ContainerStarted","Data":"71733d284d7a661e5788ad2a99d92730f3b60d0cc2e3069cd9a68e27bc610261"} Feb 19 11:15:01 crc kubenswrapper[4873]: I0219 11:15:01.503491 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" event={"ID":"b012f5e4-512b-4887-87c4-3b1d54b23599","Type":"ContainerStarted","Data":"667382952d4ebc7834562e45505740a4d0b603b4c0f3b51e63777f81a57296a9"} Feb 19 11:15:01 crc kubenswrapper[4873]: I0219 11:15:01.539828 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" 
podStartSLOduration=1.539809025 podStartE2EDuration="1.539809025s" podCreationTimestamp="2026-02-19 11:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 11:15:01.533864346 +0000 UTC m=+5410.823295984" watchObservedRunningTime="2026-02-19 11:15:01.539809025 +0000 UTC m=+5410.829240663" Feb 19 11:15:02 crc kubenswrapper[4873]: I0219 11:15:02.502969 4873 generic.go:334] "Generic (PLEG): container finished" podID="b012f5e4-512b-4887-87c4-3b1d54b23599" containerID="71733d284d7a661e5788ad2a99d92730f3b60d0cc2e3069cd9a68e27bc610261" exitCode=0 Feb 19 11:15:02 crc kubenswrapper[4873]: I0219 11:15:02.503013 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" event={"ID":"b012f5e4-512b-4887-87c4-3b1d54b23599","Type":"ContainerDied","Data":"71733d284d7a661e5788ad2a99d92730f3b60d0cc2e3069cd9a68e27bc610261"} Feb 19 11:15:03 crc kubenswrapper[4873]: I0219 11:15:03.761159 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:15:03 crc kubenswrapper[4873]: I0219 11:15:03.833542 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:15:03 crc kubenswrapper[4873]: I0219 11:15:03.899680 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.019081 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b012f5e4-512b-4887-87c4-3b1d54b23599-config-volume\") pod \"b012f5e4-512b-4887-87c4-3b1d54b23599\" (UID: \"b012f5e4-512b-4887-87c4-3b1d54b23599\") " Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.019279 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s7v7\" (UniqueName: \"kubernetes.io/projected/b012f5e4-512b-4887-87c4-3b1d54b23599-kube-api-access-5s7v7\") pod \"b012f5e4-512b-4887-87c4-3b1d54b23599\" (UID: \"b012f5e4-512b-4887-87c4-3b1d54b23599\") " Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.019313 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b012f5e4-512b-4887-87c4-3b1d54b23599-secret-volume\") pod \"b012f5e4-512b-4887-87c4-3b1d54b23599\" (UID: \"b012f5e4-512b-4887-87c4-3b1d54b23599\") " Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.019830 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b012f5e4-512b-4887-87c4-3b1d54b23599-config-volume" (OuterVolumeSpecName: "config-volume") pod "b012f5e4-512b-4887-87c4-3b1d54b23599" (UID: "b012f5e4-512b-4887-87c4-3b1d54b23599"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.027319 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b012f5e4-512b-4887-87c4-3b1d54b23599-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b012f5e4-512b-4887-87c4-3b1d54b23599" (UID: "b012f5e4-512b-4887-87c4-3b1d54b23599"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.027328 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b012f5e4-512b-4887-87c4-3b1d54b23599-kube-api-access-5s7v7" (OuterVolumeSpecName: "kube-api-access-5s7v7") pod "b012f5e4-512b-4887-87c4-3b1d54b23599" (UID: "b012f5e4-512b-4887-87c4-3b1d54b23599"). InnerVolumeSpecName "kube-api-access-5s7v7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.121698 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s7v7\" (UniqueName: \"kubernetes.io/projected/b012f5e4-512b-4887-87c4-3b1d54b23599-kube-api-access-5s7v7\") on node \"crc\" DevicePath \"\"" Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.121731 4873 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b012f5e4-512b-4887-87c4-3b1d54b23599-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.121741 4873 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b012f5e4-512b-4887-87c4-3b1d54b23599-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.522964 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" event={"ID":"b012f5e4-512b-4887-87c4-3b1d54b23599","Type":"ContainerDied","Data":"667382952d4ebc7834562e45505740a4d0b603b4c0f3b51e63777f81a57296a9"} Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.523016 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="667382952d4ebc7834562e45505740a4d0b603b4c0f3b51e63777f81a57296a9" Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.523052 4873 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524995-hsrvm" Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.543717 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9vrv5"] Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.599046 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k"] Feb 19 11:15:04 crc kubenswrapper[4873]: I0219 11:15:04.610176 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524950-pp89k"] Feb 19 11:15:05 crc kubenswrapper[4873]: I0219 11:15:05.505232 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e250d05-a293-4a3c-8658-99d1ae2dc894" path="/var/lib/kubelet/pods/9e250d05-a293-4a3c-8658-99d1ae2dc894/volumes" Feb 19 11:15:05 crc kubenswrapper[4873]: I0219 11:15:05.530570 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9vrv5" podUID="3aa970ec-fd76-4bab-a561-14756fefbdd1" containerName="registry-server" containerID="cri-o://3876019e05c868470eef8302abe3da2e5159d53aa9c32b37cb60972c4e6520b9" gracePeriod=2 Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.219406 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.264814 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aa970ec-fd76-4bab-a561-14756fefbdd1-utilities\") pod \"3aa970ec-fd76-4bab-a561-14756fefbdd1\" (UID: \"3aa970ec-fd76-4bab-a561-14756fefbdd1\") " Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.265082 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqbp6\" (UniqueName: \"kubernetes.io/projected/3aa970ec-fd76-4bab-a561-14756fefbdd1-kube-api-access-kqbp6\") pod \"3aa970ec-fd76-4bab-a561-14756fefbdd1\" (UID: \"3aa970ec-fd76-4bab-a561-14756fefbdd1\") " Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.265149 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aa970ec-fd76-4bab-a561-14756fefbdd1-catalog-content\") pod \"3aa970ec-fd76-4bab-a561-14756fefbdd1\" (UID: \"3aa970ec-fd76-4bab-a561-14756fefbdd1\") " Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.265555 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aa970ec-fd76-4bab-a561-14756fefbdd1-utilities" (OuterVolumeSpecName: "utilities") pod "3aa970ec-fd76-4bab-a561-14756fefbdd1" (UID: "3aa970ec-fd76-4bab-a561-14756fefbdd1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.284684 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aa970ec-fd76-4bab-a561-14756fefbdd1-kube-api-access-kqbp6" (OuterVolumeSpecName: "kube-api-access-kqbp6") pod "3aa970ec-fd76-4bab-a561-14756fefbdd1" (UID: "3aa970ec-fd76-4bab-a561-14756fefbdd1"). InnerVolumeSpecName "kube-api-access-kqbp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.367812 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqbp6\" (UniqueName: \"kubernetes.io/projected/3aa970ec-fd76-4bab-a561-14756fefbdd1-kube-api-access-kqbp6\") on node \"crc\" DevicePath \"\"" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.367850 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aa970ec-fd76-4bab-a561-14756fefbdd1-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.392254 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aa970ec-fd76-4bab-a561-14756fefbdd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3aa970ec-fd76-4bab-a561-14756fefbdd1" (UID: "3aa970ec-fd76-4bab-a561-14756fefbdd1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.469767 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aa970ec-fd76-4bab-a561-14756fefbdd1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.540397 4873 generic.go:334] "Generic (PLEG): container finished" podID="3aa970ec-fd76-4bab-a561-14756fefbdd1" containerID="3876019e05c868470eef8302abe3da2e5159d53aa9c32b37cb60972c4e6520b9" exitCode=0 Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.540437 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vrv5" event={"ID":"3aa970ec-fd76-4bab-a561-14756fefbdd1","Type":"ContainerDied","Data":"3876019e05c868470eef8302abe3da2e5159d53aa9c32b37cb60972c4e6520b9"} Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.540462 4873 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-9vrv5" event={"ID":"3aa970ec-fd76-4bab-a561-14756fefbdd1","Type":"ContainerDied","Data":"dea05a1dfe955bca77ebea3dc2661c6c66f8b364f16daacd38376cbfa5a9555b"} Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.540481 4873 scope.go:117] "RemoveContainer" containerID="3876019e05c868470eef8302abe3da2e5159d53aa9c32b37cb60972c4e6520b9" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.540490 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9vrv5" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.561209 4873 scope.go:117] "RemoveContainer" containerID="119594e8ef0e26304e1384501bf3399b006a307679a417dd104d0e34ceace31b" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.585174 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9vrv5"] Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.594029 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9vrv5"] Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.606039 4873 scope.go:117] "RemoveContainer" containerID="d87b5266055a760854f04cd5355a22ae3b05d93504a1ddb696329e751f4db248" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.658818 4873 scope.go:117] "RemoveContainer" containerID="3876019e05c868470eef8302abe3da2e5159d53aa9c32b37cb60972c4e6520b9" Feb 19 11:15:06 crc kubenswrapper[4873]: E0219 11:15:06.659767 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3876019e05c868470eef8302abe3da2e5159d53aa9c32b37cb60972c4e6520b9\": container with ID starting with 3876019e05c868470eef8302abe3da2e5159d53aa9c32b37cb60972c4e6520b9 not found: ID does not exist" containerID="3876019e05c868470eef8302abe3da2e5159d53aa9c32b37cb60972c4e6520b9" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.659805 4873 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3876019e05c868470eef8302abe3da2e5159d53aa9c32b37cb60972c4e6520b9"} err="failed to get container status \"3876019e05c868470eef8302abe3da2e5159d53aa9c32b37cb60972c4e6520b9\": rpc error: code = NotFound desc = could not find container \"3876019e05c868470eef8302abe3da2e5159d53aa9c32b37cb60972c4e6520b9\": container with ID starting with 3876019e05c868470eef8302abe3da2e5159d53aa9c32b37cb60972c4e6520b9 not found: ID does not exist" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.659857 4873 scope.go:117] "RemoveContainer" containerID="119594e8ef0e26304e1384501bf3399b006a307679a417dd104d0e34ceace31b" Feb 19 11:15:06 crc kubenswrapper[4873]: E0219 11:15:06.660215 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119594e8ef0e26304e1384501bf3399b006a307679a417dd104d0e34ceace31b\": container with ID starting with 119594e8ef0e26304e1384501bf3399b006a307679a417dd104d0e34ceace31b not found: ID does not exist" containerID="119594e8ef0e26304e1384501bf3399b006a307679a417dd104d0e34ceace31b" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.660241 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119594e8ef0e26304e1384501bf3399b006a307679a417dd104d0e34ceace31b"} err="failed to get container status \"119594e8ef0e26304e1384501bf3399b006a307679a417dd104d0e34ceace31b\": rpc error: code = NotFound desc = could not find container \"119594e8ef0e26304e1384501bf3399b006a307679a417dd104d0e34ceace31b\": container with ID starting with 119594e8ef0e26304e1384501bf3399b006a307679a417dd104d0e34ceace31b not found: ID does not exist" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.660259 4873 scope.go:117] "RemoveContainer" containerID="d87b5266055a760854f04cd5355a22ae3b05d93504a1ddb696329e751f4db248" Feb 19 11:15:06 crc kubenswrapper[4873]: E0219 
11:15:06.660516 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d87b5266055a760854f04cd5355a22ae3b05d93504a1ddb696329e751f4db248\": container with ID starting with d87b5266055a760854f04cd5355a22ae3b05d93504a1ddb696329e751f4db248 not found: ID does not exist" containerID="d87b5266055a760854f04cd5355a22ae3b05d93504a1ddb696329e751f4db248" Feb 19 11:15:06 crc kubenswrapper[4873]: I0219 11:15:06.660545 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d87b5266055a760854f04cd5355a22ae3b05d93504a1ddb696329e751f4db248"} err="failed to get container status \"d87b5266055a760854f04cd5355a22ae3b05d93504a1ddb696329e751f4db248\": rpc error: code = NotFound desc = could not find container \"d87b5266055a760854f04cd5355a22ae3b05d93504a1ddb696329e751f4db248\": container with ID starting with d87b5266055a760854f04cd5355a22ae3b05d93504a1ddb696329e751f4db248 not found: ID does not exist" Feb 19 11:15:07 crc kubenswrapper[4873]: I0219 11:15:07.500067 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aa970ec-fd76-4bab-a561-14756fefbdd1" path="/var/lib/kubelet/pods/3aa970ec-fd76-4bab-a561-14756fefbdd1/volumes" Feb 19 11:15:08 crc kubenswrapper[4873]: I0219 11:15:08.858870 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-v7nww_5d79d4d8-e595-4aec-bc0b-7347b826c257/prometheus-operator/0.log" Feb 19 11:15:09 crc kubenswrapper[4873]: I0219 11:15:09.056379 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7_3180318c-7d9a-454b-8de4-887fabae362b/prometheus-operator-admission-webhook/0.log" Feb 19 11:15:09 crc kubenswrapper[4873]: I0219 11:15:09.084877 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7496f9f864-qptdb_4724c979-0040-4017-86ce-78d2a8bdb44e/prometheus-operator-admission-webhook/0.log" Feb 19 11:15:09 crc kubenswrapper[4873]: I0219 11:15:09.255080 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-7wtlv_b23281d2-935e-47c1-bc83-8d00c7649625/operator/0.log" Feb 19 11:15:09 crc kubenswrapper[4873]: I0219 11:15:09.302234 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-8sflg_ea1cc2c7-c932-4b3d-b718-d017eb06163f/perses-operator/0.log" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.827474 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w429f"] Feb 19 11:15:21 crc kubenswrapper[4873]: E0219 11:15:21.828582 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aa970ec-fd76-4bab-a561-14756fefbdd1" containerName="extract-content" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.828598 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aa970ec-fd76-4bab-a561-14756fefbdd1" containerName="extract-content" Feb 19 11:15:21 crc kubenswrapper[4873]: E0219 11:15:21.828634 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aa970ec-fd76-4bab-a561-14756fefbdd1" containerName="extract-utilities" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.828640 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aa970ec-fd76-4bab-a561-14756fefbdd1" containerName="extract-utilities" Feb 19 11:15:21 crc kubenswrapper[4873]: E0219 11:15:21.828651 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b012f5e4-512b-4887-87c4-3b1d54b23599" containerName="collect-profiles" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.828657 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="b012f5e4-512b-4887-87c4-3b1d54b23599" 
containerName="collect-profiles" Feb 19 11:15:21 crc kubenswrapper[4873]: E0219 11:15:21.828670 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aa970ec-fd76-4bab-a561-14756fefbdd1" containerName="registry-server" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.828676 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aa970ec-fd76-4bab-a561-14756fefbdd1" containerName="registry-server" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.828878 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aa970ec-fd76-4bab-a561-14756fefbdd1" containerName="registry-server" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.828894 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="b012f5e4-512b-4887-87c4-3b1d54b23599" containerName="collect-profiles" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.830352 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.836552 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w429f"] Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.892377 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5b5347f-ec96-4e47-a667-286f7e382b01-catalog-content\") pod \"community-operators-w429f\" (UID: \"f5b5347f-ec96-4e47-a667-286f7e382b01\") " pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.892421 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5b5347f-ec96-4e47-a667-286f7e382b01-utilities\") pod \"community-operators-w429f\" (UID: \"f5b5347f-ec96-4e47-a667-286f7e382b01\") " 
pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.892538 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwfms\" (UniqueName: \"kubernetes.io/projected/f5b5347f-ec96-4e47-a667-286f7e382b01-kube-api-access-xwfms\") pod \"community-operators-w429f\" (UID: \"f5b5347f-ec96-4e47-a667-286f7e382b01\") " pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.994836 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwfms\" (UniqueName: \"kubernetes.io/projected/f5b5347f-ec96-4e47-a667-286f7e382b01-kube-api-access-xwfms\") pod \"community-operators-w429f\" (UID: \"f5b5347f-ec96-4e47-a667-286f7e382b01\") " pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.994981 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5b5347f-ec96-4e47-a667-286f7e382b01-catalog-content\") pod \"community-operators-w429f\" (UID: \"f5b5347f-ec96-4e47-a667-286f7e382b01\") " pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.995001 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5b5347f-ec96-4e47-a667-286f7e382b01-utilities\") pod \"community-operators-w429f\" (UID: \"f5b5347f-ec96-4e47-a667-286f7e382b01\") " pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.995502 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5b5347f-ec96-4e47-a667-286f7e382b01-catalog-content\") pod \"community-operators-w429f\" (UID: 
\"f5b5347f-ec96-4e47-a667-286f7e382b01\") " pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:21 crc kubenswrapper[4873]: I0219 11:15:21.995561 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5b5347f-ec96-4e47-a667-286f7e382b01-utilities\") pod \"community-operators-w429f\" (UID: \"f5b5347f-ec96-4e47-a667-286f7e382b01\") " pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:22 crc kubenswrapper[4873]: I0219 11:15:22.028074 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwfms\" (UniqueName: \"kubernetes.io/projected/f5b5347f-ec96-4e47-a667-286f7e382b01-kube-api-access-xwfms\") pod \"community-operators-w429f\" (UID: \"f5b5347f-ec96-4e47-a667-286f7e382b01\") " pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:22 crc kubenswrapper[4873]: I0219 11:15:22.198667 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:22 crc kubenswrapper[4873]: I0219 11:15:22.487509 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7t964_4a42b4a3-c207-40a8-80b9-0532a0ec2865/controller/0.log" Feb 19 11:15:22 crc kubenswrapper[4873]: I0219 11:15:22.526376 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7t964_4a42b4a3-c207-40a8-80b9-0532a0ec2865/kube-rbac-proxy/0.log" Feb 19 11:15:22 crc kubenswrapper[4873]: I0219 11:15:22.722976 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-frr-files/0.log" Feb 19 11:15:22 crc kubenswrapper[4873]: I0219 11:15:22.789665 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w429f"] Feb 19 11:15:22 crc kubenswrapper[4873]: I0219 11:15:22.983867 4873 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-frr-files/0.log" Feb 19 11:15:22 crc kubenswrapper[4873]: I0219 11:15:22.992910 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-reloader/0.log" Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.007200 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-metrics/0.log" Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.035885 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-reloader/0.log" Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.229560 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-frr-files/0.log" Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.229597 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-reloader/0.log" Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.264009 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-metrics/0.log" Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.300788 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-metrics/0.log" Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.483236 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-frr-files/0.log" Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.484934 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-metrics/0.log" Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.505216 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-reloader/0.log" Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.542064 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/controller/0.log" Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.664355 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/frr-metrics/0.log" Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.697746 4873 generic.go:334] "Generic (PLEG): container finished" podID="f5b5347f-ec96-4e47-a667-286f7e382b01" containerID="d76c5be7fa7447fdf794eef4c34831d7fd615bae29cb424d67a2de54bab7268b" exitCode=0 Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.697795 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w429f" event={"ID":"f5b5347f-ec96-4e47-a667-286f7e382b01","Type":"ContainerDied","Data":"d76c5be7fa7447fdf794eef4c34831d7fd615bae29cb424d67a2de54bab7268b"} Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.697825 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w429f" event={"ID":"f5b5347f-ec96-4e47-a667-286f7e382b01","Type":"ContainerStarted","Data":"53cf5f5bcf7ba2b2a7e15e41ecbaff43df691ee310c04d83ecb5ae5c7ae80c8f"} Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.747477 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/kube-rbac-proxy-frr/0.log" Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.780331 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/kube-rbac-proxy/0.log" Feb 19 11:15:23 crc kubenswrapper[4873]: I0219 11:15:23.980110 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/reloader/0.log" Feb 19 11:15:24 crc kubenswrapper[4873]: I0219 11:15:24.068264 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-xwr52_8d8f9aee-601f-4530-876b-83709311196b/frr-k8s-webhook-server/0.log" Feb 19 11:15:24 crc kubenswrapper[4873]: I0219 11:15:24.243999 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6897955989-f6tl8_94f344cf-0f09-4812-ab40-dcce7f260a53/manager/0.log" Feb 19 11:15:24 crc kubenswrapper[4873]: I0219 11:15:24.557798 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7bf7457c95-rq2ph_e9d29e18-f362-478f-911d-ed979e43aae1/webhook-server/0.log" Feb 19 11:15:24 crc kubenswrapper[4873]: I0219 11:15:24.666280 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-phsr6_46cac2a1-6c87-4c4e-a73f-92dbee290015/kube-rbac-proxy/0.log" Feb 19 11:15:24 crc kubenswrapper[4873]: I0219 11:15:24.710769 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w429f" event={"ID":"f5b5347f-ec96-4e47-a667-286f7e382b01","Type":"ContainerStarted","Data":"47f4cc4a1e4fd76aa97fcb23bb085ce2414f8284a04c68faa6b249add2e91219"} Feb 19 11:15:25 crc kubenswrapper[4873]: I0219 11:15:25.482758 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-phsr6_46cac2a1-6c87-4c4e-a73f-92dbee290015/speaker/0.log" Feb 19 11:15:25 crc kubenswrapper[4873]: I0219 11:15:25.534202 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/frr/0.log" Feb 19 11:15:26 crc kubenswrapper[4873]: I0219 11:15:26.727515 4873 generic.go:334] "Generic (PLEG): container finished" podID="f5b5347f-ec96-4e47-a667-286f7e382b01" containerID="47f4cc4a1e4fd76aa97fcb23bb085ce2414f8284a04c68faa6b249add2e91219" exitCode=0 Feb 19 11:15:26 crc kubenswrapper[4873]: I0219 11:15:26.727591 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w429f" event={"ID":"f5b5347f-ec96-4e47-a667-286f7e382b01","Type":"ContainerDied","Data":"47f4cc4a1e4fd76aa97fcb23bb085ce2414f8284a04c68faa6b249add2e91219"} Feb 19 11:15:27 crc kubenswrapper[4873]: I0219 11:15:27.740995 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w429f" event={"ID":"f5b5347f-ec96-4e47-a667-286f7e382b01","Type":"ContainerStarted","Data":"f8e747bd988653ca7f19917ec2b99fc1db607d63f2ebbd9eed9a7cbaa27a3c15"} Feb 19 11:15:27 crc kubenswrapper[4873]: I0219 11:15:27.763596 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w429f" podStartSLOduration=3.255409032 podStartE2EDuration="6.763580988s" podCreationTimestamp="2026-02-19 11:15:21 +0000 UTC" firstStartedPulling="2026-02-19 11:15:23.701797019 +0000 UTC m=+5432.991228677" lastFinishedPulling="2026-02-19 11:15:27.209968995 +0000 UTC m=+5436.499400633" observedRunningTime="2026-02-19 11:15:27.761318501 +0000 UTC m=+5437.050750139" watchObservedRunningTime="2026-02-19 11:15:27.763580988 +0000 UTC m=+5437.053012626" Feb 19 11:15:32 crc kubenswrapper[4873]: I0219 11:15:32.199309 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:32 crc kubenswrapper[4873]: I0219 11:15:32.199962 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:32 crc kubenswrapper[4873]: I0219 11:15:32.249015 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:32 crc kubenswrapper[4873]: I0219 11:15:32.845257 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:32 crc kubenswrapper[4873]: I0219 11:15:32.897682 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w429f"] Feb 19 11:15:34 crc kubenswrapper[4873]: I0219 11:15:34.801461 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w429f" podUID="f5b5347f-ec96-4e47-a667-286f7e382b01" containerName="registry-server" containerID="cri-o://f8e747bd988653ca7f19917ec2b99fc1db607d63f2ebbd9eed9a7cbaa27a3c15" gracePeriod=2 Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.299408 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.468169 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5b5347f-ec96-4e47-a667-286f7e382b01-catalog-content\") pod \"f5b5347f-ec96-4e47-a667-286f7e382b01\" (UID: \"f5b5347f-ec96-4e47-a667-286f7e382b01\") " Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.468258 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwfms\" (UniqueName: \"kubernetes.io/projected/f5b5347f-ec96-4e47-a667-286f7e382b01-kube-api-access-xwfms\") pod \"f5b5347f-ec96-4e47-a667-286f7e382b01\" (UID: \"f5b5347f-ec96-4e47-a667-286f7e382b01\") " Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.468541 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5b5347f-ec96-4e47-a667-286f7e382b01-utilities\") pod \"f5b5347f-ec96-4e47-a667-286f7e382b01\" (UID: \"f5b5347f-ec96-4e47-a667-286f7e382b01\") " Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.469692 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5b5347f-ec96-4e47-a667-286f7e382b01-utilities" (OuterVolumeSpecName: "utilities") pod "f5b5347f-ec96-4e47-a667-286f7e382b01" (UID: "f5b5347f-ec96-4e47-a667-286f7e382b01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.474308 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5b5347f-ec96-4e47-a667-286f7e382b01-kube-api-access-xwfms" (OuterVolumeSpecName: "kube-api-access-xwfms") pod "f5b5347f-ec96-4e47-a667-286f7e382b01" (UID: "f5b5347f-ec96-4e47-a667-286f7e382b01"). InnerVolumeSpecName "kube-api-access-xwfms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.572502 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5b5347f-ec96-4e47-a667-286f7e382b01-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.572554 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwfms\" (UniqueName: \"kubernetes.io/projected/f5b5347f-ec96-4e47-a667-286f7e382b01-kube-api-access-xwfms\") on node \"crc\" DevicePath \"\"" Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.813155 4873 generic.go:334] "Generic (PLEG): container finished" podID="f5b5347f-ec96-4e47-a667-286f7e382b01" containerID="f8e747bd988653ca7f19917ec2b99fc1db607d63f2ebbd9eed9a7cbaa27a3c15" exitCode=0 Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.813240 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w429f" Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.813262 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w429f" event={"ID":"f5b5347f-ec96-4e47-a667-286f7e382b01","Type":"ContainerDied","Data":"f8e747bd988653ca7f19917ec2b99fc1db607d63f2ebbd9eed9a7cbaa27a3c15"} Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.813650 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w429f" event={"ID":"f5b5347f-ec96-4e47-a667-286f7e382b01","Type":"ContainerDied","Data":"53cf5f5bcf7ba2b2a7e15e41ecbaff43df691ee310c04d83ecb5ae5c7ae80c8f"} Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.813675 4873 scope.go:117] "RemoveContainer" containerID="f8e747bd988653ca7f19917ec2b99fc1db607d63f2ebbd9eed9a7cbaa27a3c15" Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.843797 4873 scope.go:117] "RemoveContainer" 
containerID="47f4cc4a1e4fd76aa97fcb23bb085ce2414f8284a04c68faa6b249add2e91219" Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.869474 4873 scope.go:117] "RemoveContainer" containerID="d76c5be7fa7447fdf794eef4c34831d7fd615bae29cb424d67a2de54bab7268b" Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.931764 4873 scope.go:117] "RemoveContainer" containerID="f8e747bd988653ca7f19917ec2b99fc1db607d63f2ebbd9eed9a7cbaa27a3c15" Feb 19 11:15:35 crc kubenswrapper[4873]: E0219 11:15:35.932196 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8e747bd988653ca7f19917ec2b99fc1db607d63f2ebbd9eed9a7cbaa27a3c15\": container with ID starting with f8e747bd988653ca7f19917ec2b99fc1db607d63f2ebbd9eed9a7cbaa27a3c15 not found: ID does not exist" containerID="f8e747bd988653ca7f19917ec2b99fc1db607d63f2ebbd9eed9a7cbaa27a3c15" Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.932232 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8e747bd988653ca7f19917ec2b99fc1db607d63f2ebbd9eed9a7cbaa27a3c15"} err="failed to get container status \"f8e747bd988653ca7f19917ec2b99fc1db607d63f2ebbd9eed9a7cbaa27a3c15\": rpc error: code = NotFound desc = could not find container \"f8e747bd988653ca7f19917ec2b99fc1db607d63f2ebbd9eed9a7cbaa27a3c15\": container with ID starting with f8e747bd988653ca7f19917ec2b99fc1db607d63f2ebbd9eed9a7cbaa27a3c15 not found: ID does not exist" Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.932258 4873 scope.go:117] "RemoveContainer" containerID="47f4cc4a1e4fd76aa97fcb23bb085ce2414f8284a04c68faa6b249add2e91219" Feb 19 11:15:35 crc kubenswrapper[4873]: E0219 11:15:35.932511 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f4cc4a1e4fd76aa97fcb23bb085ce2414f8284a04c68faa6b249add2e91219\": container with ID starting with 
47f4cc4a1e4fd76aa97fcb23bb085ce2414f8284a04c68faa6b249add2e91219 not found: ID does not exist" containerID="47f4cc4a1e4fd76aa97fcb23bb085ce2414f8284a04c68faa6b249add2e91219" Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.932538 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f4cc4a1e4fd76aa97fcb23bb085ce2414f8284a04c68faa6b249add2e91219"} err="failed to get container status \"47f4cc4a1e4fd76aa97fcb23bb085ce2414f8284a04c68faa6b249add2e91219\": rpc error: code = NotFound desc = could not find container \"47f4cc4a1e4fd76aa97fcb23bb085ce2414f8284a04c68faa6b249add2e91219\": container with ID starting with 47f4cc4a1e4fd76aa97fcb23bb085ce2414f8284a04c68faa6b249add2e91219 not found: ID does not exist" Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.932584 4873 scope.go:117] "RemoveContainer" containerID="d76c5be7fa7447fdf794eef4c34831d7fd615bae29cb424d67a2de54bab7268b" Feb 19 11:15:35 crc kubenswrapper[4873]: E0219 11:15:35.932828 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d76c5be7fa7447fdf794eef4c34831d7fd615bae29cb424d67a2de54bab7268b\": container with ID starting with d76c5be7fa7447fdf794eef4c34831d7fd615bae29cb424d67a2de54bab7268b not found: ID does not exist" containerID="d76c5be7fa7447fdf794eef4c34831d7fd615bae29cb424d67a2de54bab7268b" Feb 19 11:15:35 crc kubenswrapper[4873]: I0219 11:15:35.932855 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76c5be7fa7447fdf794eef4c34831d7fd615bae29cb424d67a2de54bab7268b"} err="failed to get container status \"d76c5be7fa7447fdf794eef4c34831d7fd615bae29cb424d67a2de54bab7268b\": rpc error: code = NotFound desc = could not find container \"d76c5be7fa7447fdf794eef4c34831d7fd615bae29cb424d67a2de54bab7268b\": container with ID starting with d76c5be7fa7447fdf794eef4c34831d7fd615bae29cb424d67a2de54bab7268b not found: ID does not 
exist" Feb 19 11:15:36 crc kubenswrapper[4873]: I0219 11:15:36.189322 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5b5347f-ec96-4e47-a667-286f7e382b01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5b5347f-ec96-4e47-a667-286f7e382b01" (UID: "f5b5347f-ec96-4e47-a667-286f7e382b01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:15:36 crc kubenswrapper[4873]: I0219 11:15:36.285520 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5b5347f-ec96-4e47-a667-286f7e382b01-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 11:15:36 crc kubenswrapper[4873]: I0219 11:15:36.456165 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w429f"] Feb 19 11:15:36 crc kubenswrapper[4873]: I0219 11:15:36.469499 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w429f"] Feb 19 11:15:37 crc kubenswrapper[4873]: I0219 11:15:37.495316 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5b5347f-ec96-4e47-a667-286f7e382b01" path="/var/lib/kubelet/pods/f5b5347f-ec96-4e47-a667-286f7e382b01/volumes" Feb 19 11:15:37 crc kubenswrapper[4873]: I0219 11:15:37.889942 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn_0709e82b-60e9-4aed-8e42-e39928e74c21/util/0.log" Feb 19 11:15:38 crc kubenswrapper[4873]: I0219 11:15:38.178858 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn_0709e82b-60e9-4aed-8e42-e39928e74c21/util/0.log" Feb 19 11:15:38 crc kubenswrapper[4873]: I0219 11:15:38.180927 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn_0709e82b-60e9-4aed-8e42-e39928e74c21/pull/0.log" Feb 19 11:15:38 crc kubenswrapper[4873]: I0219 11:15:38.210652 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn_0709e82b-60e9-4aed-8e42-e39928e74c21/pull/0.log" Feb 19 11:15:38 crc kubenswrapper[4873]: I0219 11:15:38.365023 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn_0709e82b-60e9-4aed-8e42-e39928e74c21/util/0.log" Feb 19 11:15:38 crc kubenswrapper[4873]: I0219 11:15:38.383526 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn_0709e82b-60e9-4aed-8e42-e39928e74c21/extract/0.log" Feb 19 11:15:38 crc kubenswrapper[4873]: I0219 11:15:38.450419 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn_0709e82b-60e9-4aed-8e42-e39928e74c21/pull/0.log" Feb 19 11:15:38 crc kubenswrapper[4873]: I0219 11:15:38.569092 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf_7a09955d-14f6-4877-bcb4-701d57165495/util/0.log" Feb 19 11:15:38 crc kubenswrapper[4873]: I0219 11:15:38.724172 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf_7a09955d-14f6-4877-bcb4-701d57165495/pull/0.log" Feb 19 11:15:38 crc kubenswrapper[4873]: I0219 11:15:38.735933 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf_7a09955d-14f6-4877-bcb4-701d57165495/util/0.log" Feb 19 
11:15:38 crc kubenswrapper[4873]: I0219 11:15:38.742228 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf_7a09955d-14f6-4877-bcb4-701d57165495/pull/0.log" Feb 19 11:15:38 crc kubenswrapper[4873]: I0219 11:15:38.928749 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf_7a09955d-14f6-4877-bcb4-701d57165495/extract/0.log" Feb 19 11:15:38 crc kubenswrapper[4873]: I0219 11:15:38.950487 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf_7a09955d-14f6-4877-bcb4-701d57165495/util/0.log" Feb 19 11:15:38 crc kubenswrapper[4873]: I0219 11:15:38.971179 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf_7a09955d-14f6-4877-bcb4-701d57165495/pull/0.log" Feb 19 11:15:39 crc kubenswrapper[4873]: I0219 11:15:39.100826 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bdcwz_d27fce7f-0ae7-4e22-885f-ad2a398647cc/extract-utilities/0.log" Feb 19 11:15:39 crc kubenswrapper[4873]: I0219 11:15:39.264627 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bdcwz_d27fce7f-0ae7-4e22-885f-ad2a398647cc/extract-utilities/0.log" Feb 19 11:15:39 crc kubenswrapper[4873]: I0219 11:15:39.300674 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bdcwz_d27fce7f-0ae7-4e22-885f-ad2a398647cc/extract-content/0.log" Feb 19 11:15:39 crc kubenswrapper[4873]: I0219 11:15:39.314609 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bdcwz_d27fce7f-0ae7-4e22-885f-ad2a398647cc/extract-content/0.log" Feb 19 
11:15:39 crc kubenswrapper[4873]: I0219 11:15:39.467646 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bdcwz_d27fce7f-0ae7-4e22-885f-ad2a398647cc/extract-utilities/0.log" Feb 19 11:15:39 crc kubenswrapper[4873]: I0219 11:15:39.521354 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bdcwz_d27fce7f-0ae7-4e22-885f-ad2a398647cc/extract-content/0.log" Feb 19 11:15:39 crc kubenswrapper[4873]: I0219 11:15:39.751680 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zk9wc_5f466b31-21ca-4f19-9b73-72cfb7c68d55/extract-utilities/0.log" Feb 19 11:15:39 crc kubenswrapper[4873]: I0219 11:15:39.937655 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zk9wc_5f466b31-21ca-4f19-9b73-72cfb7c68d55/extract-utilities/0.log" Feb 19 11:15:39 crc kubenswrapper[4873]: I0219 11:15:39.954281 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bdcwz_d27fce7f-0ae7-4e22-885f-ad2a398647cc/registry-server/0.log" Feb 19 11:15:40 crc kubenswrapper[4873]: I0219 11:15:40.028496 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zk9wc_5f466b31-21ca-4f19-9b73-72cfb7c68d55/extract-content/0.log" Feb 19 11:15:40 crc kubenswrapper[4873]: I0219 11:15:40.031307 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zk9wc_5f466b31-21ca-4f19-9b73-72cfb7c68d55/extract-content/0.log" Feb 19 11:15:40 crc kubenswrapper[4873]: I0219 11:15:40.179696 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zk9wc_5f466b31-21ca-4f19-9b73-72cfb7c68d55/extract-utilities/0.log" Feb 19 11:15:40 crc kubenswrapper[4873]: I0219 11:15:40.185424 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-zk9wc_5f466b31-21ca-4f19-9b73-72cfb7c68d55/extract-content/0.log" Feb 19 11:15:40 crc kubenswrapper[4873]: I0219 11:15:40.447188 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv_14a07337-b89d-4574-aa0f-f9a3cdebdd48/util/0.log" Feb 19 11:15:40 crc kubenswrapper[4873]: I0219 11:15:40.611068 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv_14a07337-b89d-4574-aa0f-f9a3cdebdd48/util/0.log" Feb 19 11:15:40 crc kubenswrapper[4873]: I0219 11:15:40.690628 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv_14a07337-b89d-4574-aa0f-f9a3cdebdd48/pull/0.log" Feb 19 11:15:40 crc kubenswrapper[4873]: I0219 11:15:40.701840 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv_14a07337-b89d-4574-aa0f-f9a3cdebdd48/pull/0.log" Feb 19 11:15:40 crc kubenswrapper[4873]: I0219 11:15:40.919957 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv_14a07337-b89d-4574-aa0f-f9a3cdebdd48/util/0.log" Feb 19 11:15:41 crc kubenswrapper[4873]: I0219 11:15:41.029843 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv_14a07337-b89d-4574-aa0f-f9a3cdebdd48/extract/0.log" Feb 19 11:15:41 crc kubenswrapper[4873]: I0219 11:15:41.050965 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv_14a07337-b89d-4574-aa0f-f9a3cdebdd48/pull/0.log" Feb 19 11:15:41 crc 
kubenswrapper[4873]: I0219 11:15:41.078667 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zk9wc_5f466b31-21ca-4f19-9b73-72cfb7c68d55/registry-server/0.log" Feb 19 11:15:41 crc kubenswrapper[4873]: I0219 11:15:41.238064 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jt9rj_1d58439b-31c6-44df-a32d-48f0fcb6a361/marketplace-operator/0.log" Feb 19 11:15:41 crc kubenswrapper[4873]: I0219 11:15:41.285236 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvshp_f9a9b521-3ed0-40c1-b38f-34c21bd9c242/extract-utilities/0.log" Feb 19 11:15:41 crc kubenswrapper[4873]: I0219 11:15:41.466158 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvshp_f9a9b521-3ed0-40c1-b38f-34c21bd9c242/extract-content/0.log" Feb 19 11:15:41 crc kubenswrapper[4873]: I0219 11:15:41.480360 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvshp_f9a9b521-3ed0-40c1-b38f-34c21bd9c242/extract-content/0.log" Feb 19 11:15:41 crc kubenswrapper[4873]: I0219 11:15:41.500517 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvshp_f9a9b521-3ed0-40c1-b38f-34c21bd9c242/extract-utilities/0.log" Feb 19 11:15:41 crc kubenswrapper[4873]: I0219 11:15:41.674380 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvshp_f9a9b521-3ed0-40c1-b38f-34c21bd9c242/extract-content/0.log" Feb 19 11:15:41 crc kubenswrapper[4873]: I0219 11:15:41.699573 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvshp_f9a9b521-3ed0-40c1-b38f-34c21bd9c242/extract-utilities/0.log" Feb 19 11:15:41 crc kubenswrapper[4873]: I0219 11:15:41.864586 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvshp_f9a9b521-3ed0-40c1-b38f-34c21bd9c242/registry-server/0.log" Feb 19 11:15:41 crc kubenswrapper[4873]: I0219 11:15:41.921299 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prw4c_4cc54252-cfdf-4b71-bfa5-552dcd26500d/extract-utilities/0.log" Feb 19 11:15:42 crc kubenswrapper[4873]: I0219 11:15:42.064075 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prw4c_4cc54252-cfdf-4b71-bfa5-552dcd26500d/extract-content/0.log" Feb 19 11:15:42 crc kubenswrapper[4873]: I0219 11:15:42.067475 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prw4c_4cc54252-cfdf-4b71-bfa5-552dcd26500d/extract-content/0.log" Feb 19 11:15:42 crc kubenswrapper[4873]: I0219 11:15:42.116413 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prw4c_4cc54252-cfdf-4b71-bfa5-552dcd26500d/extract-utilities/0.log" Feb 19 11:15:42 crc kubenswrapper[4873]: I0219 11:15:42.240302 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prw4c_4cc54252-cfdf-4b71-bfa5-552dcd26500d/extract-utilities/0.log" Feb 19 11:15:42 crc kubenswrapper[4873]: I0219 11:15:42.262265 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prw4c_4cc54252-cfdf-4b71-bfa5-552dcd26500d/extract-content/0.log" Feb 19 11:15:42 crc kubenswrapper[4873]: I0219 11:15:42.915810 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prw4c_4cc54252-cfdf-4b71-bfa5-552dcd26500d/registry-server/0.log" Feb 19 11:15:55 crc kubenswrapper[4873]: I0219 11:15:55.869503 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7_3180318c-7d9a-454b-8de4-887fabae362b/prometheus-operator-admission-webhook/0.log" Feb 19 11:15:55 crc kubenswrapper[4873]: I0219 11:15:55.870410 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7496f9f864-qptdb_4724c979-0040-4017-86ce-78d2a8bdb44e/prometheus-operator-admission-webhook/0.log" Feb 19 11:15:55 crc kubenswrapper[4873]: I0219 11:15:55.903790 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-v7nww_5d79d4d8-e595-4aec-bc0b-7347b826c257/prometheus-operator/0.log" Feb 19 11:15:56 crc kubenswrapper[4873]: I0219 11:15:56.096227 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-7wtlv_b23281d2-935e-47c1-bc83-8d00c7649625/operator/0.log" Feb 19 11:15:56 crc kubenswrapper[4873]: I0219 11:15:56.107305 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-8sflg_ea1cc2c7-c932-4b3d-b718-d017eb06163f/perses-operator/0.log" Feb 19 11:16:03 crc kubenswrapper[4873]: I0219 11:16:03.701412 4873 scope.go:117] "RemoveContainer" containerID="97237f992c0b70ff79ac5f913c59bb566d1f47e053b9c438613bd77bb3e8a5fe" Feb 19 11:16:18 crc kubenswrapper[4873]: I0219 11:16:18.240944 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 11:16:18 crc kubenswrapper[4873]: I0219 11:16:18.242313 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 11:16:48 crc kubenswrapper[4873]: I0219 11:16:48.240927 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 11:16:48 crc kubenswrapper[4873]: I0219 11:16:48.241542 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 11:17:18 crc kubenswrapper[4873]: I0219 11:17:18.240782 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 11:17:18 crc kubenswrapper[4873]: I0219 11:17:18.241368 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 11:17:18 crc kubenswrapper[4873]: I0219 11:17:18.241420 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 11:17:18 crc kubenswrapper[4873]: I0219 11:17:18.242242 4873 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b1849d703253da95bfe5a3436c40938b54212f4d26fc335188390020db850543"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 11:17:18 crc kubenswrapper[4873]: I0219 11:17:18.242293 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://b1849d703253da95bfe5a3436c40938b54212f4d26fc335188390020db850543" gracePeriod=600 Feb 19 11:17:19 crc kubenswrapper[4873]: I0219 11:17:19.058297 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="b1849d703253da95bfe5a3436c40938b54212f4d26fc335188390020db850543" exitCode=0 Feb 19 11:17:19 crc kubenswrapper[4873]: I0219 11:17:19.058418 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"b1849d703253da95bfe5a3436c40938b54212f4d26fc335188390020db850543"} Feb 19 11:17:19 crc kubenswrapper[4873]: I0219 11:17:19.058994 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838"} Feb 19 11:17:19 crc kubenswrapper[4873]: I0219 11:17:19.059044 4873 scope.go:117] "RemoveContainer" containerID="1c6d3e122fcbfb2819864013ec95d084efa34729bd03b5b61b19e8b2dae70bd7" Feb 19 11:18:03 crc kubenswrapper[4873]: I0219 11:18:03.836555 4873 scope.go:117] "RemoveContainer" containerID="e1e47bb1ff8a672d572432525165620c6fe8cdbb5878839e48d767115053def9" Feb 19 
11:18:07 crc kubenswrapper[4873]: I0219 11:18:07.563529 4873 generic.go:334] "Generic (PLEG): container finished" podID="a6f4f1cb-6b34-4940-be18-6ba992fd72d7" containerID="ec0556d441d0438d8f010526f2d7e9ab2d6477bf9ac4834a65568abbbc6e94c7" exitCode=0 Feb 19 11:18:07 crc kubenswrapper[4873]: I0219 11:18:07.563601 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lms9s/must-gather-lgwst" event={"ID":"a6f4f1cb-6b34-4940-be18-6ba992fd72d7","Type":"ContainerDied","Data":"ec0556d441d0438d8f010526f2d7e9ab2d6477bf9ac4834a65568abbbc6e94c7"} Feb 19 11:18:07 crc kubenswrapper[4873]: I0219 11:18:07.564550 4873 scope.go:117] "RemoveContainer" containerID="ec0556d441d0438d8f010526f2d7e9ab2d6477bf9ac4834a65568abbbc6e94c7" Feb 19 11:18:07 crc kubenswrapper[4873]: I0219 11:18:07.921502 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lms9s_must-gather-lgwst_a6f4f1cb-6b34-4940-be18-6ba992fd72d7/gather/0.log" Feb 19 11:18:10 crc kubenswrapper[4873]: E0219 11:18:10.541147 4873 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.156:45014->38.102.83.156:45689: write tcp 38.102.83.156:45014->38.102.83.156:45689: write: broken pipe Feb 19 11:18:16 crc kubenswrapper[4873]: I0219 11:18:16.770156 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lms9s/must-gather-lgwst"] Feb 19 11:18:16 crc kubenswrapper[4873]: I0219 11:18:16.771043 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-lms9s/must-gather-lgwst" podUID="a6f4f1cb-6b34-4940-be18-6ba992fd72d7" containerName="copy" containerID="cri-o://d21919709c4c5fc4cd8c1920eab8a780a95e4fd90dfbc7716f574f51697d2620" gracePeriod=2 Feb 19 11:18:16 crc kubenswrapper[4873]: I0219 11:18:16.780733 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lms9s/must-gather-lgwst"] Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.225589 4873 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lms9s_must-gather-lgwst_a6f4f1cb-6b34-4940-be18-6ba992fd72d7/copy/0.log" Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.226324 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lms9s/must-gather-lgwst" Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.368888 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a6f4f1cb-6b34-4940-be18-6ba992fd72d7-must-gather-output\") pod \"a6f4f1cb-6b34-4940-be18-6ba992fd72d7\" (UID: \"a6f4f1cb-6b34-4940-be18-6ba992fd72d7\") " Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.368979 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9b49\" (UniqueName: \"kubernetes.io/projected/a6f4f1cb-6b34-4940-be18-6ba992fd72d7-kube-api-access-n9b49\") pod \"a6f4f1cb-6b34-4940-be18-6ba992fd72d7\" (UID: \"a6f4f1cb-6b34-4940-be18-6ba992fd72d7\") " Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.375644 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f4f1cb-6b34-4940-be18-6ba992fd72d7-kube-api-access-n9b49" (OuterVolumeSpecName: "kube-api-access-n9b49") pod "a6f4f1cb-6b34-4940-be18-6ba992fd72d7" (UID: "a6f4f1cb-6b34-4940-be18-6ba992fd72d7"). InnerVolumeSpecName "kube-api-access-n9b49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.472705 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9b49\" (UniqueName: \"kubernetes.io/projected/a6f4f1cb-6b34-4940-be18-6ba992fd72d7-kube-api-access-n9b49\") on node \"crc\" DevicePath \"\"" Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.581764 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6f4f1cb-6b34-4940-be18-6ba992fd72d7-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a6f4f1cb-6b34-4940-be18-6ba992fd72d7" (UID: "a6f4f1cb-6b34-4940-be18-6ba992fd72d7"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.676720 4873 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a6f4f1cb-6b34-4940-be18-6ba992fd72d7-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.683056 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lms9s_must-gather-lgwst_a6f4f1cb-6b34-4940-be18-6ba992fd72d7/copy/0.log" Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.684561 4873 generic.go:334] "Generic (PLEG): container finished" podID="a6f4f1cb-6b34-4940-be18-6ba992fd72d7" containerID="d21919709c4c5fc4cd8c1920eab8a780a95e4fd90dfbc7716f574f51697d2620" exitCode=143 Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.684623 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lms9s/must-gather-lgwst"
Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.684647 4873 scope.go:117] "RemoveContainer" containerID="d21919709c4c5fc4cd8c1920eab8a780a95e4fd90dfbc7716f574f51697d2620"
Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.725801 4873 scope.go:117] "RemoveContainer" containerID="ec0556d441d0438d8f010526f2d7e9ab2d6477bf9ac4834a65568abbbc6e94c7"
Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.768641 4873 scope.go:117] "RemoveContainer" containerID="d21919709c4c5fc4cd8c1920eab8a780a95e4fd90dfbc7716f574f51697d2620"
Feb 19 11:18:17 crc kubenswrapper[4873]: E0219 11:18:17.770811 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d21919709c4c5fc4cd8c1920eab8a780a95e4fd90dfbc7716f574f51697d2620\": container with ID starting with d21919709c4c5fc4cd8c1920eab8a780a95e4fd90dfbc7716f574f51697d2620 not found: ID does not exist" containerID="d21919709c4c5fc4cd8c1920eab8a780a95e4fd90dfbc7716f574f51697d2620"
Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.770877 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21919709c4c5fc4cd8c1920eab8a780a95e4fd90dfbc7716f574f51697d2620"} err="failed to get container status \"d21919709c4c5fc4cd8c1920eab8a780a95e4fd90dfbc7716f574f51697d2620\": rpc error: code = NotFound desc = could not find container \"d21919709c4c5fc4cd8c1920eab8a780a95e4fd90dfbc7716f574f51697d2620\": container with ID starting with d21919709c4c5fc4cd8c1920eab8a780a95e4fd90dfbc7716f574f51697d2620 not found: ID does not exist"
Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.770912 4873 scope.go:117] "RemoveContainer" containerID="ec0556d441d0438d8f010526f2d7e9ab2d6477bf9ac4834a65568abbbc6e94c7"
Feb 19 11:18:17 crc kubenswrapper[4873]: E0219 11:18:17.773561 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec0556d441d0438d8f010526f2d7e9ab2d6477bf9ac4834a65568abbbc6e94c7\": container with ID starting with ec0556d441d0438d8f010526f2d7e9ab2d6477bf9ac4834a65568abbbc6e94c7 not found: ID does not exist" containerID="ec0556d441d0438d8f010526f2d7e9ab2d6477bf9ac4834a65568abbbc6e94c7"
Feb 19 11:18:17 crc kubenswrapper[4873]: I0219 11:18:17.773652 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0556d441d0438d8f010526f2d7e9ab2d6477bf9ac4834a65568abbbc6e94c7"} err="failed to get container status \"ec0556d441d0438d8f010526f2d7e9ab2d6477bf9ac4834a65568abbbc6e94c7\": rpc error: code = NotFound desc = could not find container \"ec0556d441d0438d8f010526f2d7e9ab2d6477bf9ac4834a65568abbbc6e94c7\": container with ID starting with ec0556d441d0438d8f010526f2d7e9ab2d6477bf9ac4834a65568abbbc6e94c7 not found: ID does not exist"
Feb 19 11:18:19 crc kubenswrapper[4873]: I0219 11:18:19.496450 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6f4f1cb-6b34-4940-be18-6ba992fd72d7" path="/var/lib/kubelet/pods/a6f4f1cb-6b34-4940-be18-6ba992fd72d7/volumes"
Feb 19 11:19:03 crc kubenswrapper[4873]: I0219 11:19:03.906414 4873 scope.go:117] "RemoveContainer" containerID="9881fb46a0d2568f8dddfc8da179e6f97035508c7f4131e1a48dad7a6c2fd139"
Feb 19 11:19:18 crc kubenswrapper[4873]: I0219 11:19:18.240518 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 11:19:18 crc kubenswrapper[4873]: I0219 11:19:18.241059 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 11:19:48 crc kubenswrapper[4873]: I0219 11:19:48.241068 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 11:19:48 crc kubenswrapper[4873]: I0219 11:19:48.241732 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 11:20:18 crc kubenswrapper[4873]: I0219 11:20:18.240935 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 11:20:18 crc kubenswrapper[4873]: I0219 11:20:18.241530 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 11:20:18 crc kubenswrapper[4873]: I0219 11:20:18.241578 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7"
Feb 19 11:20:18 crc kubenswrapper[4873]: I0219 11:20:18.242337 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 11:20:18 crc kubenswrapper[4873]: I0219 11:20:18.242388 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" gracePeriod=600
Feb 19 11:20:18 crc kubenswrapper[4873]: E0219 11:20:18.366260 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:20:18 crc kubenswrapper[4873]: I0219 11:20:18.927202 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" exitCode=0
Feb 19 11:20:18 crc kubenswrapper[4873]: I0219 11:20:18.927287 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838"}
Feb 19 11:20:18 crc kubenswrapper[4873]: I0219 11:20:18.927591 4873 scope.go:117] "RemoveContainer" containerID="b1849d703253da95bfe5a3436c40938b54212f4d26fc335188390020db850543"
Feb 19 11:20:18 crc kubenswrapper[4873]: I0219 11:20:18.928398 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838"
Feb 19 11:20:18 crc kubenswrapper[4873]: E0219 11:20:18.928727 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.720644 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2mprj"]
Feb 19 11:20:21 crc kubenswrapper[4873]: E0219 11:20:21.721983 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f4f1cb-6b34-4940-be18-6ba992fd72d7" containerName="gather"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.722007 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f4f1cb-6b34-4940-be18-6ba992fd72d7" containerName="gather"
Feb 19 11:20:21 crc kubenswrapper[4873]: E0219 11:20:21.722034 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b5347f-ec96-4e47-a667-286f7e382b01" containerName="extract-utilities"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.722046 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b5347f-ec96-4e47-a667-286f7e382b01" containerName="extract-utilities"
Feb 19 11:20:21 crc kubenswrapper[4873]: E0219 11:20:21.722075 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b5347f-ec96-4e47-a667-286f7e382b01" containerName="registry-server"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.722087 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b5347f-ec96-4e47-a667-286f7e382b01" containerName="registry-server"
Feb 19 11:20:21 crc kubenswrapper[4873]: E0219 11:20:21.722137 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f4f1cb-6b34-4940-be18-6ba992fd72d7" containerName="copy"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.722148 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f4f1cb-6b34-4940-be18-6ba992fd72d7" containerName="copy"
Feb 19 11:20:21 crc kubenswrapper[4873]: E0219 11:20:21.722191 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b5347f-ec96-4e47-a667-286f7e382b01" containerName="extract-content"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.722203 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b5347f-ec96-4e47-a667-286f7e382b01" containerName="extract-content"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.722650 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f4f1cb-6b34-4940-be18-6ba992fd72d7" containerName="gather"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.722673 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f4f1cb-6b34-4940-be18-6ba992fd72d7" containerName="copy"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.722689 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5b5347f-ec96-4e47-a667-286f7e382b01" containerName="registry-server"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.725156 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.731339 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2mprj"]
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.864118 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9nqw\" (UniqueName: \"kubernetes.io/projected/a751696c-f9c4-4ab3-aba8-95342fed53a4-kube-api-access-j9nqw\") pod \"certified-operators-2mprj\" (UID: \"a751696c-f9c4-4ab3-aba8-95342fed53a4\") " pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.864162 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751696c-f9c4-4ab3-aba8-95342fed53a4-catalog-content\") pod \"certified-operators-2mprj\" (UID: \"a751696c-f9c4-4ab3-aba8-95342fed53a4\") " pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.864496 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751696c-f9c4-4ab3-aba8-95342fed53a4-utilities\") pod \"certified-operators-2mprj\" (UID: \"a751696c-f9c4-4ab3-aba8-95342fed53a4\") " pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.966996 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751696c-f9c4-4ab3-aba8-95342fed53a4-utilities\") pod \"certified-operators-2mprj\" (UID: \"a751696c-f9c4-4ab3-aba8-95342fed53a4\") " pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.967153 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9nqw\" (UniqueName: \"kubernetes.io/projected/a751696c-f9c4-4ab3-aba8-95342fed53a4-kube-api-access-j9nqw\") pod \"certified-operators-2mprj\" (UID: \"a751696c-f9c4-4ab3-aba8-95342fed53a4\") " pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.967173 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751696c-f9c4-4ab3-aba8-95342fed53a4-catalog-content\") pod \"certified-operators-2mprj\" (UID: \"a751696c-f9c4-4ab3-aba8-95342fed53a4\") " pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.967609 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751696c-f9c4-4ab3-aba8-95342fed53a4-catalog-content\") pod \"certified-operators-2mprj\" (UID: \"a751696c-f9c4-4ab3-aba8-95342fed53a4\") " pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:21 crc kubenswrapper[4873]: I0219 11:20:21.967819 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751696c-f9c4-4ab3-aba8-95342fed53a4-utilities\") pod \"certified-operators-2mprj\" (UID: \"a751696c-f9c4-4ab3-aba8-95342fed53a4\") " pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:22 crc kubenswrapper[4873]: I0219 11:20:22.014410 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9nqw\" (UniqueName: \"kubernetes.io/projected/a751696c-f9c4-4ab3-aba8-95342fed53a4-kube-api-access-j9nqw\") pod \"certified-operators-2mprj\" (UID: \"a751696c-f9c4-4ab3-aba8-95342fed53a4\") " pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:22 crc kubenswrapper[4873]: I0219 11:20:22.058741 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:22 crc kubenswrapper[4873]: I0219 11:20:22.617003 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2mprj"]
Feb 19 11:20:22 crc kubenswrapper[4873]: I0219 11:20:22.975473 4873 generic.go:334] "Generic (PLEG): container finished" podID="a751696c-f9c4-4ab3-aba8-95342fed53a4" containerID="88112815c5110d49bf37ce2a6f63b237b5591a8d87cfd8400a25ea8171753c84" exitCode=0
Feb 19 11:20:22 crc kubenswrapper[4873]: I0219 11:20:22.975520 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mprj" event={"ID":"a751696c-f9c4-4ab3-aba8-95342fed53a4","Type":"ContainerDied","Data":"88112815c5110d49bf37ce2a6f63b237b5591a8d87cfd8400a25ea8171753c84"}
Feb 19 11:20:22 crc kubenswrapper[4873]: I0219 11:20:22.975563 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mprj" event={"ID":"a751696c-f9c4-4ab3-aba8-95342fed53a4","Type":"ContainerStarted","Data":"037a9073984a8a4475faa5e356612016fb33bcecce2615925d804f9f77b81660"}
Feb 19 11:20:22 crc kubenswrapper[4873]: I0219 11:20:22.979494 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 11:20:24 crc kubenswrapper[4873]: I0219 11:20:24.995677 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mprj" event={"ID":"a751696c-f9c4-4ab3-aba8-95342fed53a4","Type":"ContainerStarted","Data":"0373cd844ad41c1f4e277a296c09e9b2f4f251b01a9feaa92bc83413e087803b"}
Feb 19 11:20:26 crc kubenswrapper[4873]: I0219 11:20:26.007490 4873 generic.go:334] "Generic (PLEG): container finished" podID="a751696c-f9c4-4ab3-aba8-95342fed53a4" containerID="0373cd844ad41c1f4e277a296c09e9b2f4f251b01a9feaa92bc83413e087803b" exitCode=0
Feb 19 11:20:26 crc kubenswrapper[4873]: I0219 11:20:26.007597 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mprj" event={"ID":"a751696c-f9c4-4ab3-aba8-95342fed53a4","Type":"ContainerDied","Data":"0373cd844ad41c1f4e277a296c09e9b2f4f251b01a9feaa92bc83413e087803b"}
Feb 19 11:20:27 crc kubenswrapper[4873]: I0219 11:20:27.036421 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mprj" event={"ID":"a751696c-f9c4-4ab3-aba8-95342fed53a4","Type":"ContainerStarted","Data":"4025b1756223afbc418f2a5c83f4a05f80b7c4b8565270105b193a686fe90f51"}
Feb 19 11:20:27 crc kubenswrapper[4873]: I0219 11:20:27.077749 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2mprj" podStartSLOduration=2.6368362579999998 podStartE2EDuration="6.077722971s" podCreationTimestamp="2026-02-19 11:20:21 +0000 UTC" firstStartedPulling="2026-02-19 11:20:22.979185138 +0000 UTC m=+5732.268616786" lastFinishedPulling="2026-02-19 11:20:26.420071861 +0000 UTC m=+5735.709503499" observedRunningTime="2026-02-19 11:20:27.067503054 +0000 UTC m=+5736.356934692" watchObservedRunningTime="2026-02-19 11:20:27.077722971 +0000 UTC m=+5736.367154609"
Feb 19 11:20:31 crc kubenswrapper[4873]: I0219 11:20:31.492961 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838"
Feb 19 11:20:31 crc kubenswrapper[4873]: E0219 11:20:31.494021 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:20:32 crc kubenswrapper[4873]: I0219 11:20:32.059322 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:32 crc kubenswrapper[4873]: I0219 11:20:32.059369 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:32 crc kubenswrapper[4873]: I0219 11:20:32.140513 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:32 crc kubenswrapper[4873]: I0219 11:20:32.226381 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:32 crc kubenswrapper[4873]: I0219 11:20:32.408891 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2mprj"]
Feb 19 11:20:34 crc kubenswrapper[4873]: I0219 11:20:34.105549 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2mprj" podUID="a751696c-f9c4-4ab3-aba8-95342fed53a4" containerName="registry-server" containerID="cri-o://4025b1756223afbc418f2a5c83f4a05f80b7c4b8565270105b193a686fe90f51" gracePeriod=2
Feb 19 11:20:34 crc kubenswrapper[4873]: I0219 11:20:34.569169 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:34 crc kubenswrapper[4873]: I0219 11:20:34.653663 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751696c-f9c4-4ab3-aba8-95342fed53a4-catalog-content\") pod \"a751696c-f9c4-4ab3-aba8-95342fed53a4\" (UID: \"a751696c-f9c4-4ab3-aba8-95342fed53a4\") "
Feb 19 11:20:34 crc kubenswrapper[4873]: I0219 11:20:34.653870 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751696c-f9c4-4ab3-aba8-95342fed53a4-utilities\") pod \"a751696c-f9c4-4ab3-aba8-95342fed53a4\" (UID: \"a751696c-f9c4-4ab3-aba8-95342fed53a4\") "
Feb 19 11:20:34 crc kubenswrapper[4873]: I0219 11:20:34.653900 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9nqw\" (UniqueName: \"kubernetes.io/projected/a751696c-f9c4-4ab3-aba8-95342fed53a4-kube-api-access-j9nqw\") pod \"a751696c-f9c4-4ab3-aba8-95342fed53a4\" (UID: \"a751696c-f9c4-4ab3-aba8-95342fed53a4\") "
Feb 19 11:20:34 crc kubenswrapper[4873]: I0219 11:20:34.657888 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a751696c-f9c4-4ab3-aba8-95342fed53a4-utilities" (OuterVolumeSpecName: "utilities") pod "a751696c-f9c4-4ab3-aba8-95342fed53a4" (UID: "a751696c-f9c4-4ab3-aba8-95342fed53a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 11:20:34 crc kubenswrapper[4873]: I0219 11:20:34.661095 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a751696c-f9c4-4ab3-aba8-95342fed53a4-kube-api-access-j9nqw" (OuterVolumeSpecName: "kube-api-access-j9nqw") pod "a751696c-f9c4-4ab3-aba8-95342fed53a4" (UID: "a751696c-f9c4-4ab3-aba8-95342fed53a4"). InnerVolumeSpecName "kube-api-access-j9nqw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 11:20:34 crc kubenswrapper[4873]: I0219 11:20:34.757020 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751696c-f9c4-4ab3-aba8-95342fed53a4-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 11:20:34 crc kubenswrapper[4873]: I0219 11:20:34.757050 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9nqw\" (UniqueName: \"kubernetes.io/projected/a751696c-f9c4-4ab3-aba8-95342fed53a4-kube-api-access-j9nqw\") on node \"crc\" DevicePath \"\""
Feb 19 11:20:34 crc kubenswrapper[4873]: I0219 11:20:34.824008 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a751696c-f9c4-4ab3-aba8-95342fed53a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a751696c-f9c4-4ab3-aba8-95342fed53a4" (UID: "a751696c-f9c4-4ab3-aba8-95342fed53a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 11:20:34 crc kubenswrapper[4873]: I0219 11:20:34.858707 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751696c-f9c4-4ab3-aba8-95342fed53a4-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.115508 4873 generic.go:334] "Generic (PLEG): container finished" podID="a751696c-f9c4-4ab3-aba8-95342fed53a4" containerID="4025b1756223afbc418f2a5c83f4a05f80b7c4b8565270105b193a686fe90f51" exitCode=0
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.115566 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mprj" event={"ID":"a751696c-f9c4-4ab3-aba8-95342fed53a4","Type":"ContainerDied","Data":"4025b1756223afbc418f2a5c83f4a05f80b7c4b8565270105b193a686fe90f51"}
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.115598 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mprj" event={"ID":"a751696c-f9c4-4ab3-aba8-95342fed53a4","Type":"ContainerDied","Data":"037a9073984a8a4475faa5e356612016fb33bcecce2615925d804f9f77b81660"}
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.115616 4873 scope.go:117] "RemoveContainer" containerID="4025b1756223afbc418f2a5c83f4a05f80b7c4b8565270105b193a686fe90f51"
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.115795 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mprj"
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.142577 4873 scope.go:117] "RemoveContainer" containerID="0373cd844ad41c1f4e277a296c09e9b2f4f251b01a9feaa92bc83413e087803b"
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.153463 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2mprj"]
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.162216 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2mprj"]
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.165689 4873 scope.go:117] "RemoveContainer" containerID="88112815c5110d49bf37ce2a6f63b237b5591a8d87cfd8400a25ea8171753c84"
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.214087 4873 scope.go:117] "RemoveContainer" containerID="4025b1756223afbc418f2a5c83f4a05f80b7c4b8565270105b193a686fe90f51"
Feb 19 11:20:35 crc kubenswrapper[4873]: E0219 11:20:35.214541 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4025b1756223afbc418f2a5c83f4a05f80b7c4b8565270105b193a686fe90f51\": container with ID starting with 4025b1756223afbc418f2a5c83f4a05f80b7c4b8565270105b193a686fe90f51 not found: ID does not exist" containerID="4025b1756223afbc418f2a5c83f4a05f80b7c4b8565270105b193a686fe90f51"
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.214598 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4025b1756223afbc418f2a5c83f4a05f80b7c4b8565270105b193a686fe90f51"} err="failed to get container status \"4025b1756223afbc418f2a5c83f4a05f80b7c4b8565270105b193a686fe90f51\": rpc error: code = NotFound desc = could not find container \"4025b1756223afbc418f2a5c83f4a05f80b7c4b8565270105b193a686fe90f51\": container with ID starting with 4025b1756223afbc418f2a5c83f4a05f80b7c4b8565270105b193a686fe90f51 not found: ID does not exist"
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.214643 4873 scope.go:117] "RemoveContainer" containerID="0373cd844ad41c1f4e277a296c09e9b2f4f251b01a9feaa92bc83413e087803b"
Feb 19 11:20:35 crc kubenswrapper[4873]: E0219 11:20:35.214992 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0373cd844ad41c1f4e277a296c09e9b2f4f251b01a9feaa92bc83413e087803b\": container with ID starting with 0373cd844ad41c1f4e277a296c09e9b2f4f251b01a9feaa92bc83413e087803b not found: ID does not exist" containerID="0373cd844ad41c1f4e277a296c09e9b2f4f251b01a9feaa92bc83413e087803b"
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.215032 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0373cd844ad41c1f4e277a296c09e9b2f4f251b01a9feaa92bc83413e087803b"} err="failed to get container status \"0373cd844ad41c1f4e277a296c09e9b2f4f251b01a9feaa92bc83413e087803b\": rpc error: code = NotFound desc = could not find container \"0373cd844ad41c1f4e277a296c09e9b2f4f251b01a9feaa92bc83413e087803b\": container with ID starting with 0373cd844ad41c1f4e277a296c09e9b2f4f251b01a9feaa92bc83413e087803b not found: ID does not exist"
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.215057 4873 scope.go:117] "RemoveContainer" containerID="88112815c5110d49bf37ce2a6f63b237b5591a8d87cfd8400a25ea8171753c84"
Feb 19 11:20:35 crc kubenswrapper[4873]: E0219 11:20:35.215373 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88112815c5110d49bf37ce2a6f63b237b5591a8d87cfd8400a25ea8171753c84\": container with ID starting with 88112815c5110d49bf37ce2a6f63b237b5591a8d87cfd8400a25ea8171753c84 not found: ID does not exist" containerID="88112815c5110d49bf37ce2a6f63b237b5591a8d87cfd8400a25ea8171753c84"
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.215408 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88112815c5110d49bf37ce2a6f63b237b5591a8d87cfd8400a25ea8171753c84"} err="failed to get container status \"88112815c5110d49bf37ce2a6f63b237b5591a8d87cfd8400a25ea8171753c84\": rpc error: code = NotFound desc = could not find container \"88112815c5110d49bf37ce2a6f63b237b5591a8d87cfd8400a25ea8171753c84\": container with ID starting with 88112815c5110d49bf37ce2a6f63b237b5591a8d87cfd8400a25ea8171753c84 not found: ID does not exist"
Feb 19 11:20:35 crc kubenswrapper[4873]: I0219 11:20:35.501980 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a751696c-f9c4-4ab3-aba8-95342fed53a4" path="/var/lib/kubelet/pods/a751696c-f9c4-4ab3-aba8-95342fed53a4/volumes"
Feb 19 11:20:44 crc kubenswrapper[4873]: I0219 11:20:44.484403 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838"
Feb 19 11:20:44 crc kubenswrapper[4873]: E0219 11:20:44.485070 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:20:57 crc kubenswrapper[4873]: I0219 11:20:57.484065 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838"
Feb 19 11:20:57 crc kubenswrapper[4873]: E0219 11:20:57.485010 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:21:04 crc kubenswrapper[4873]: I0219 11:21:04.016840 4873 scope.go:117] "RemoveContainer" containerID="b30980f32b2645748e6c10b7a5e9191a7ed287e91ec99d6202bbad12353be541"
Feb 19 11:21:04 crc kubenswrapper[4873]: I0219 11:21:04.048277 4873 scope.go:117] "RemoveContainer" containerID="97f52c7b6a3d35760dce1e3eea42c096bc9ea72133b589b617ac83afe8756696"
Feb 19 11:21:04 crc kubenswrapper[4873]: I0219 11:21:04.092785 4873 scope.go:117] "RemoveContainer" containerID="981c3b34f5f254eaaf4de449435114b331f5b9ed7452571e8eef21054d499734"
Feb 19 11:21:09 crc kubenswrapper[4873]: I0219 11:21:09.485250 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838"
Feb 19 11:21:09 crc kubenswrapper[4873]: E0219 11:21:09.485972 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:21:23 crc kubenswrapper[4873]: I0219 11:21:23.484439 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838"
Feb 19 11:21:23 crc kubenswrapper[4873]: E0219 11:21:23.485276 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:21:34 crc kubenswrapper[4873]: I0219 11:21:34.484430 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838"
Feb 19 11:21:34 crc kubenswrapper[4873]: E0219 11:21:34.486468 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:21:48 crc kubenswrapper[4873]: I0219 11:21:48.483989 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838"
Feb 19 11:21:48 crc kubenswrapper[4873]: E0219 11:21:48.484843 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.141449 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qhpxg/must-gather-vhpx8"]
Feb 19 11:21:57 crc kubenswrapper[4873]: E0219 11:21:57.143686 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a751696c-f9c4-4ab3-aba8-95342fed53a4" containerName="registry-server"
Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.143830 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a751696c-f9c4-4ab3-aba8-95342fed53a4" containerName="registry-server"
Feb 19 11:21:57 crc kubenswrapper[4873]: E0219 11:21:57.143931 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a751696c-f9c4-4ab3-aba8-95342fed53a4" containerName="extract-content"
Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.144013 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a751696c-f9c4-4ab3-aba8-95342fed53a4" containerName="extract-content"
Feb 19 11:21:57 crc kubenswrapper[4873]: E0219 11:21:57.144116 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a751696c-f9c4-4ab3-aba8-95342fed53a4" containerName="extract-utilities"
Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.144199 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="a751696c-f9c4-4ab3-aba8-95342fed53a4" containerName="extract-utilities"
Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.144547 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="a751696c-f9c4-4ab3-aba8-95342fed53a4" containerName="registry-server"
Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.146894 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qhpxg/must-gather-vhpx8"
Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.159863 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qhpxg"/"openshift-service-ca.crt"
Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.160205 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qhpxg"/"kube-root-ca.crt"
Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.171421 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qhpxg/must-gather-vhpx8"]
Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.194490 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93cc0682-3903-4dad-a4a1-3e807492bab4-must-gather-output\") pod \"must-gather-vhpx8\" (UID: \"93cc0682-3903-4dad-a4a1-3e807492bab4\") " pod="openshift-must-gather-qhpxg/must-gather-vhpx8"
Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.194577 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfvlt\" (UniqueName: \"kubernetes.io/projected/93cc0682-3903-4dad-a4a1-3e807492bab4-kube-api-access-nfvlt\") pod \"must-gather-vhpx8\" (UID: \"93cc0682-3903-4dad-a4a1-3e807492bab4\") " pod="openshift-must-gather-qhpxg/must-gather-vhpx8"
Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.296227 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93cc0682-3903-4dad-a4a1-3e807492bab4-must-gather-output\") pod \"must-gather-vhpx8\" (UID: \"93cc0682-3903-4dad-a4a1-3e807492bab4\") " pod="openshift-must-gather-qhpxg/must-gather-vhpx8"
Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.296314 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nfvlt\" (UniqueName: \"kubernetes.io/projected/93cc0682-3903-4dad-a4a1-3e807492bab4-kube-api-access-nfvlt\") pod \"must-gather-vhpx8\" (UID: \"93cc0682-3903-4dad-a4a1-3e807492bab4\") " pod="openshift-must-gather-qhpxg/must-gather-vhpx8" Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.296951 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93cc0682-3903-4dad-a4a1-3e807492bab4-must-gather-output\") pod \"must-gather-vhpx8\" (UID: \"93cc0682-3903-4dad-a4a1-3e807492bab4\") " pod="openshift-must-gather-qhpxg/must-gather-vhpx8" Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.322385 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfvlt\" (UniqueName: \"kubernetes.io/projected/93cc0682-3903-4dad-a4a1-3e807492bab4-kube-api-access-nfvlt\") pod \"must-gather-vhpx8\" (UID: \"93cc0682-3903-4dad-a4a1-3e807492bab4\") " pod="openshift-must-gather-qhpxg/must-gather-vhpx8" Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.485174 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qhpxg/must-gather-vhpx8" Feb 19 11:21:57 crc kubenswrapper[4873]: I0219 11:21:57.960670 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qhpxg/must-gather-vhpx8"] Feb 19 11:21:58 crc kubenswrapper[4873]: I0219 11:21:58.599845 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qhpxg/must-gather-vhpx8" event={"ID":"93cc0682-3903-4dad-a4a1-3e807492bab4","Type":"ContainerStarted","Data":"767ace8d23069d52fe292289a53031e59a4a02afb307e1f96188d5747c12d9df"} Feb 19 11:21:58 crc kubenswrapper[4873]: I0219 11:21:58.600355 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qhpxg/must-gather-vhpx8" event={"ID":"93cc0682-3903-4dad-a4a1-3e807492bab4","Type":"ContainerStarted","Data":"ff3687756e2207c400bf2bbbc9410e3f3ee429ff1507157450e53ef945e6af06"} Feb 19 11:21:58 crc kubenswrapper[4873]: I0219 11:21:58.600370 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qhpxg/must-gather-vhpx8" event={"ID":"93cc0682-3903-4dad-a4a1-3e807492bab4","Type":"ContainerStarted","Data":"889f3805eaa8890505b91002548c1d0a4c1f2dd9fd0077ad36b03032e4ca0a94"} Feb 19 11:21:58 crc kubenswrapper[4873]: I0219 11:21:58.621349 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qhpxg/must-gather-vhpx8" podStartSLOduration=1.6213263580000001 podStartE2EDuration="1.621326358s" podCreationTimestamp="2026-02-19 11:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 11:21:58.618744123 +0000 UTC m=+5827.908175761" watchObservedRunningTime="2026-02-19 11:21:58.621326358 +0000 UTC m=+5827.910758006" Feb 19 11:22:00 crc kubenswrapper[4873]: I0219 11:22:00.484562 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:22:00 crc 
kubenswrapper[4873]: E0219 11:22:00.485395 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:22:02 crc kubenswrapper[4873]: I0219 11:22:02.124637 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qhpxg/crc-debug-w28cl"] Feb 19 11:22:02 crc kubenswrapper[4873]: I0219 11:22:02.127692 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qhpxg/crc-debug-w28cl" Feb 19 11:22:02 crc kubenswrapper[4873]: I0219 11:22:02.130045 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qhpxg"/"default-dockercfg-v9qxb" Feb 19 11:22:02 crc kubenswrapper[4873]: I0219 11:22:02.204232 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb42ac03-78d0-4edc-bfd9-c248a7970fa5-host\") pod \"crc-debug-w28cl\" (UID: \"bb42ac03-78d0-4edc-bfd9-c248a7970fa5\") " pod="openshift-must-gather-qhpxg/crc-debug-w28cl" Feb 19 11:22:02 crc kubenswrapper[4873]: I0219 11:22:02.204299 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvb8w\" (UniqueName: \"kubernetes.io/projected/bb42ac03-78d0-4edc-bfd9-c248a7970fa5-kube-api-access-rvb8w\") pod \"crc-debug-w28cl\" (UID: \"bb42ac03-78d0-4edc-bfd9-c248a7970fa5\") " pod="openshift-must-gather-qhpxg/crc-debug-w28cl" Feb 19 11:22:02 crc kubenswrapper[4873]: I0219 11:22:02.306372 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/bb42ac03-78d0-4edc-bfd9-c248a7970fa5-host\") pod \"crc-debug-w28cl\" (UID: \"bb42ac03-78d0-4edc-bfd9-c248a7970fa5\") " pod="openshift-must-gather-qhpxg/crc-debug-w28cl" Feb 19 11:22:02 crc kubenswrapper[4873]: I0219 11:22:02.306428 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvb8w\" (UniqueName: \"kubernetes.io/projected/bb42ac03-78d0-4edc-bfd9-c248a7970fa5-kube-api-access-rvb8w\") pod \"crc-debug-w28cl\" (UID: \"bb42ac03-78d0-4edc-bfd9-c248a7970fa5\") " pod="openshift-must-gather-qhpxg/crc-debug-w28cl" Feb 19 11:22:02 crc kubenswrapper[4873]: I0219 11:22:02.306998 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb42ac03-78d0-4edc-bfd9-c248a7970fa5-host\") pod \"crc-debug-w28cl\" (UID: \"bb42ac03-78d0-4edc-bfd9-c248a7970fa5\") " pod="openshift-must-gather-qhpxg/crc-debug-w28cl" Feb 19 11:22:02 crc kubenswrapper[4873]: I0219 11:22:02.344215 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvb8w\" (UniqueName: \"kubernetes.io/projected/bb42ac03-78d0-4edc-bfd9-c248a7970fa5-kube-api-access-rvb8w\") pod \"crc-debug-w28cl\" (UID: \"bb42ac03-78d0-4edc-bfd9-c248a7970fa5\") " pod="openshift-must-gather-qhpxg/crc-debug-w28cl" Feb 19 11:22:02 crc kubenswrapper[4873]: I0219 11:22:02.449250 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qhpxg/crc-debug-w28cl" Feb 19 11:22:02 crc kubenswrapper[4873]: W0219 11:22:02.475078 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb42ac03_78d0_4edc_bfd9_c248a7970fa5.slice/crio-3c959f9ef4fd551afacdabb17d9558a5c14d2556363a624e3f2ba4b934e76c0e WatchSource:0}: Error finding container 3c959f9ef4fd551afacdabb17d9558a5c14d2556363a624e3f2ba4b934e76c0e: Status 404 returned error can't find the container with id 3c959f9ef4fd551afacdabb17d9558a5c14d2556363a624e3f2ba4b934e76c0e Feb 19 11:22:02 crc kubenswrapper[4873]: I0219 11:22:02.637077 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qhpxg/crc-debug-w28cl" event={"ID":"bb42ac03-78d0-4edc-bfd9-c248a7970fa5","Type":"ContainerStarted","Data":"3c959f9ef4fd551afacdabb17d9558a5c14d2556363a624e3f2ba4b934e76c0e"} Feb 19 11:22:03 crc kubenswrapper[4873]: I0219 11:22:03.647003 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qhpxg/crc-debug-w28cl" event={"ID":"bb42ac03-78d0-4edc-bfd9-c248a7970fa5","Type":"ContainerStarted","Data":"d9b27392492b242bb0b7cf96897d9ede2e48fff749603f8472748bdf9b9a9549"} Feb 19 11:22:03 crc kubenswrapper[4873]: I0219 11:22:03.676378 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qhpxg/crc-debug-w28cl" podStartSLOduration=1.6763591039999999 podStartE2EDuration="1.676359104s" podCreationTimestamp="2026-02-19 11:22:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 11:22:03.666600029 +0000 UTC m=+5832.956031677" watchObservedRunningTime="2026-02-19 11:22:03.676359104 +0000 UTC m=+5832.965790742" Feb 19 11:22:15 crc kubenswrapper[4873]: I0219 11:22:15.484439 4873 scope.go:117] "RemoveContainer" 
containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:22:15 crc kubenswrapper[4873]: E0219 11:22:15.485256 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:22:28 crc kubenswrapper[4873]: I0219 11:22:28.484293 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:22:28 crc kubenswrapper[4873]: E0219 11:22:28.485124 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:22:40 crc kubenswrapper[4873]: I0219 11:22:40.484018 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:22:40 crc kubenswrapper[4873]: E0219 11:22:40.484751 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:22:43 crc kubenswrapper[4873]: I0219 11:22:43.992228 4873 generic.go:334] 
"Generic (PLEG): container finished" podID="bb42ac03-78d0-4edc-bfd9-c248a7970fa5" containerID="d9b27392492b242bb0b7cf96897d9ede2e48fff749603f8472748bdf9b9a9549" exitCode=0 Feb 19 11:22:43 crc kubenswrapper[4873]: I0219 11:22:43.992336 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qhpxg/crc-debug-w28cl" event={"ID":"bb42ac03-78d0-4edc-bfd9-c248a7970fa5","Type":"ContainerDied","Data":"d9b27392492b242bb0b7cf96897d9ede2e48fff749603f8472748bdf9b9a9549"} Feb 19 11:22:45 crc kubenswrapper[4873]: I0219 11:22:45.122234 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qhpxg/crc-debug-w28cl" Feb 19 11:22:45 crc kubenswrapper[4873]: I0219 11:22:45.157494 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qhpxg/crc-debug-w28cl"] Feb 19 11:22:45 crc kubenswrapper[4873]: I0219 11:22:45.167839 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qhpxg/crc-debug-w28cl"] Feb 19 11:22:45 crc kubenswrapper[4873]: I0219 11:22:45.197612 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvb8w\" (UniqueName: \"kubernetes.io/projected/bb42ac03-78d0-4edc-bfd9-c248a7970fa5-kube-api-access-rvb8w\") pod \"bb42ac03-78d0-4edc-bfd9-c248a7970fa5\" (UID: \"bb42ac03-78d0-4edc-bfd9-c248a7970fa5\") " Feb 19 11:22:45 crc kubenswrapper[4873]: I0219 11:22:45.197692 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb42ac03-78d0-4edc-bfd9-c248a7970fa5-host\") pod \"bb42ac03-78d0-4edc-bfd9-c248a7970fa5\" (UID: \"bb42ac03-78d0-4edc-bfd9-c248a7970fa5\") " Feb 19 11:22:45 crc kubenswrapper[4873]: I0219 11:22:45.197781 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb42ac03-78d0-4edc-bfd9-c248a7970fa5-host" (OuterVolumeSpecName: "host") pod 
"bb42ac03-78d0-4edc-bfd9-c248a7970fa5" (UID: "bb42ac03-78d0-4edc-bfd9-c248a7970fa5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 11:22:45 crc kubenswrapper[4873]: I0219 11:22:45.199723 4873 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb42ac03-78d0-4edc-bfd9-c248a7970fa5-host\") on node \"crc\" DevicePath \"\"" Feb 19 11:22:45 crc kubenswrapper[4873]: I0219 11:22:45.217485 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb42ac03-78d0-4edc-bfd9-c248a7970fa5-kube-api-access-rvb8w" (OuterVolumeSpecName: "kube-api-access-rvb8w") pod "bb42ac03-78d0-4edc-bfd9-c248a7970fa5" (UID: "bb42ac03-78d0-4edc-bfd9-c248a7970fa5"). InnerVolumeSpecName "kube-api-access-rvb8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:22:45 crc kubenswrapper[4873]: I0219 11:22:45.302299 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvb8w\" (UniqueName: \"kubernetes.io/projected/bb42ac03-78d0-4edc-bfd9-c248a7970fa5-kube-api-access-rvb8w\") on node \"crc\" DevicePath \"\"" Feb 19 11:22:45 crc kubenswrapper[4873]: I0219 11:22:45.494756 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb42ac03-78d0-4edc-bfd9-c248a7970fa5" path="/var/lib/kubelet/pods/bb42ac03-78d0-4edc-bfd9-c248a7970fa5/volumes" Feb 19 11:22:46 crc kubenswrapper[4873]: I0219 11:22:46.013619 4873 scope.go:117] "RemoveContainer" containerID="d9b27392492b242bb0b7cf96897d9ede2e48fff749603f8472748bdf9b9a9549" Feb 19 11:22:46 crc kubenswrapper[4873]: I0219 11:22:46.013688 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qhpxg/crc-debug-w28cl" Feb 19 11:22:46 crc kubenswrapper[4873]: I0219 11:22:46.406291 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qhpxg/crc-debug-bpp7f"] Feb 19 11:22:46 crc kubenswrapper[4873]: E0219 11:22:46.411862 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb42ac03-78d0-4edc-bfd9-c248a7970fa5" containerName="container-00" Feb 19 11:22:46 crc kubenswrapper[4873]: I0219 11:22:46.412128 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb42ac03-78d0-4edc-bfd9-c248a7970fa5" containerName="container-00" Feb 19 11:22:46 crc kubenswrapper[4873]: I0219 11:22:46.412536 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb42ac03-78d0-4edc-bfd9-c248a7970fa5" containerName="container-00" Feb 19 11:22:46 crc kubenswrapper[4873]: I0219 11:22:46.413451 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qhpxg/crc-debug-bpp7f" Feb 19 11:22:46 crc kubenswrapper[4873]: I0219 11:22:46.415797 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qhpxg"/"default-dockercfg-v9qxb" Feb 19 11:22:46 crc kubenswrapper[4873]: I0219 11:22:46.525130 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/55a0c77b-8e5a-4b18-8361-b672a9d394fb-host\") pod \"crc-debug-bpp7f\" (UID: \"55a0c77b-8e5a-4b18-8361-b672a9d394fb\") " pod="openshift-must-gather-qhpxg/crc-debug-bpp7f" Feb 19 11:22:46 crc kubenswrapper[4873]: I0219 11:22:46.525296 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwjps\" (UniqueName: \"kubernetes.io/projected/55a0c77b-8e5a-4b18-8361-b672a9d394fb-kube-api-access-lwjps\") pod \"crc-debug-bpp7f\" (UID: \"55a0c77b-8e5a-4b18-8361-b672a9d394fb\") " 
pod="openshift-must-gather-qhpxg/crc-debug-bpp7f" Feb 19 11:22:46 crc kubenswrapper[4873]: I0219 11:22:46.626549 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/55a0c77b-8e5a-4b18-8361-b672a9d394fb-host\") pod \"crc-debug-bpp7f\" (UID: \"55a0c77b-8e5a-4b18-8361-b672a9d394fb\") " pod="openshift-must-gather-qhpxg/crc-debug-bpp7f" Feb 19 11:22:46 crc kubenswrapper[4873]: I0219 11:22:46.626692 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwjps\" (UniqueName: \"kubernetes.io/projected/55a0c77b-8e5a-4b18-8361-b672a9d394fb-kube-api-access-lwjps\") pod \"crc-debug-bpp7f\" (UID: \"55a0c77b-8e5a-4b18-8361-b672a9d394fb\") " pod="openshift-must-gather-qhpxg/crc-debug-bpp7f" Feb 19 11:22:46 crc kubenswrapper[4873]: I0219 11:22:46.626713 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/55a0c77b-8e5a-4b18-8361-b672a9d394fb-host\") pod \"crc-debug-bpp7f\" (UID: \"55a0c77b-8e5a-4b18-8361-b672a9d394fb\") " pod="openshift-must-gather-qhpxg/crc-debug-bpp7f" Feb 19 11:22:46 crc kubenswrapper[4873]: I0219 11:22:46.649941 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwjps\" (UniqueName: \"kubernetes.io/projected/55a0c77b-8e5a-4b18-8361-b672a9d394fb-kube-api-access-lwjps\") pod \"crc-debug-bpp7f\" (UID: \"55a0c77b-8e5a-4b18-8361-b672a9d394fb\") " pod="openshift-must-gather-qhpxg/crc-debug-bpp7f" Feb 19 11:22:46 crc kubenswrapper[4873]: I0219 11:22:46.731003 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qhpxg/crc-debug-bpp7f" Feb 19 11:22:47 crc kubenswrapper[4873]: I0219 11:22:47.023892 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qhpxg/crc-debug-bpp7f" event={"ID":"55a0c77b-8e5a-4b18-8361-b672a9d394fb","Type":"ContainerStarted","Data":"c1a3ce9c18b56e83a8ab74c08f3f38d867ea9e51b90994bc784d9d1d2d171d1d"} Feb 19 11:22:47 crc kubenswrapper[4873]: I0219 11:22:47.024328 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qhpxg/crc-debug-bpp7f" event={"ID":"55a0c77b-8e5a-4b18-8361-b672a9d394fb","Type":"ContainerStarted","Data":"9f80af5e3d39e35b511c4a0fe0f40231a807314542902ce320d2460730d20861"} Feb 19 11:22:48 crc kubenswrapper[4873]: I0219 11:22:48.036450 4873 generic.go:334] "Generic (PLEG): container finished" podID="55a0c77b-8e5a-4b18-8361-b672a9d394fb" containerID="c1a3ce9c18b56e83a8ab74c08f3f38d867ea9e51b90994bc784d9d1d2d171d1d" exitCode=0 Feb 19 11:22:48 crc kubenswrapper[4873]: I0219 11:22:48.036540 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qhpxg/crc-debug-bpp7f" event={"ID":"55a0c77b-8e5a-4b18-8361-b672a9d394fb","Type":"ContainerDied","Data":"c1a3ce9c18b56e83a8ab74c08f3f38d867ea9e51b90994bc784d9d1d2d171d1d"} Feb 19 11:22:49 crc kubenswrapper[4873]: I0219 11:22:49.166944 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qhpxg/crc-debug-bpp7f" Feb 19 11:22:49 crc kubenswrapper[4873]: I0219 11:22:49.279277 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwjps\" (UniqueName: \"kubernetes.io/projected/55a0c77b-8e5a-4b18-8361-b672a9d394fb-kube-api-access-lwjps\") pod \"55a0c77b-8e5a-4b18-8361-b672a9d394fb\" (UID: \"55a0c77b-8e5a-4b18-8361-b672a9d394fb\") " Feb 19 11:22:49 crc kubenswrapper[4873]: I0219 11:22:49.279455 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/55a0c77b-8e5a-4b18-8361-b672a9d394fb-host\") pod \"55a0c77b-8e5a-4b18-8361-b672a9d394fb\" (UID: \"55a0c77b-8e5a-4b18-8361-b672a9d394fb\") " Feb 19 11:22:49 crc kubenswrapper[4873]: I0219 11:22:49.279509 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55a0c77b-8e5a-4b18-8361-b672a9d394fb-host" (OuterVolumeSpecName: "host") pod "55a0c77b-8e5a-4b18-8361-b672a9d394fb" (UID: "55a0c77b-8e5a-4b18-8361-b672a9d394fb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 11:22:49 crc kubenswrapper[4873]: I0219 11:22:49.280004 4873 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/55a0c77b-8e5a-4b18-8361-b672a9d394fb-host\") on node \"crc\" DevicePath \"\"" Feb 19 11:22:49 crc kubenswrapper[4873]: I0219 11:22:49.285918 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a0c77b-8e5a-4b18-8361-b672a9d394fb-kube-api-access-lwjps" (OuterVolumeSpecName: "kube-api-access-lwjps") pod "55a0c77b-8e5a-4b18-8361-b672a9d394fb" (UID: "55a0c77b-8e5a-4b18-8361-b672a9d394fb"). InnerVolumeSpecName "kube-api-access-lwjps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:22:49 crc kubenswrapper[4873]: I0219 11:22:49.382016 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwjps\" (UniqueName: \"kubernetes.io/projected/55a0c77b-8e5a-4b18-8361-b672a9d394fb-kube-api-access-lwjps\") on node \"crc\" DevicePath \"\"" Feb 19 11:22:49 crc kubenswrapper[4873]: I0219 11:22:49.408894 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qhpxg/crc-debug-bpp7f"] Feb 19 11:22:49 crc kubenswrapper[4873]: I0219 11:22:49.417748 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qhpxg/crc-debug-bpp7f"] Feb 19 11:22:49 crc kubenswrapper[4873]: I0219 11:22:49.494390 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a0c77b-8e5a-4b18-8361-b672a9d394fb" path="/var/lib/kubelet/pods/55a0c77b-8e5a-4b18-8361-b672a9d394fb/volumes" Feb 19 11:22:50 crc kubenswrapper[4873]: I0219 11:22:50.055285 4873 scope.go:117] "RemoveContainer" containerID="c1a3ce9c18b56e83a8ab74c08f3f38d867ea9e51b90994bc784d9d1d2d171d1d" Feb 19 11:22:50 crc kubenswrapper[4873]: I0219 11:22:50.055331 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qhpxg/crc-debug-bpp7f" Feb 19 11:22:50 crc kubenswrapper[4873]: I0219 11:22:50.637381 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qhpxg/crc-debug-dbqlz"] Feb 19 11:22:50 crc kubenswrapper[4873]: E0219 11:22:50.638082 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a0c77b-8e5a-4b18-8361-b672a9d394fb" containerName="container-00" Feb 19 11:22:50 crc kubenswrapper[4873]: I0219 11:22:50.638098 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a0c77b-8e5a-4b18-8361-b672a9d394fb" containerName="container-00" Feb 19 11:22:50 crc kubenswrapper[4873]: I0219 11:22:50.638346 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a0c77b-8e5a-4b18-8361-b672a9d394fb" containerName="container-00" Feb 19 11:22:50 crc kubenswrapper[4873]: I0219 11:22:50.639199 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qhpxg/crc-debug-dbqlz" Feb 19 11:22:50 crc kubenswrapper[4873]: I0219 11:22:50.642469 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qhpxg"/"default-dockercfg-v9qxb" Feb 19 11:22:50 crc kubenswrapper[4873]: I0219 11:22:50.706917 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8f4j\" (UniqueName: \"kubernetes.io/projected/dcd95282-b63c-48c7-beaa-96e7112a6bd1-kube-api-access-v8f4j\") pod \"crc-debug-dbqlz\" (UID: \"dcd95282-b63c-48c7-beaa-96e7112a6bd1\") " pod="openshift-must-gather-qhpxg/crc-debug-dbqlz" Feb 19 11:22:50 crc kubenswrapper[4873]: I0219 11:22:50.707003 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcd95282-b63c-48c7-beaa-96e7112a6bd1-host\") pod \"crc-debug-dbqlz\" (UID: \"dcd95282-b63c-48c7-beaa-96e7112a6bd1\") " 
pod="openshift-must-gather-qhpxg/crc-debug-dbqlz" Feb 19 11:22:50 crc kubenswrapper[4873]: I0219 11:22:50.809214 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8f4j\" (UniqueName: \"kubernetes.io/projected/dcd95282-b63c-48c7-beaa-96e7112a6bd1-kube-api-access-v8f4j\") pod \"crc-debug-dbqlz\" (UID: \"dcd95282-b63c-48c7-beaa-96e7112a6bd1\") " pod="openshift-must-gather-qhpxg/crc-debug-dbqlz" Feb 19 11:22:50 crc kubenswrapper[4873]: I0219 11:22:50.809304 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcd95282-b63c-48c7-beaa-96e7112a6bd1-host\") pod \"crc-debug-dbqlz\" (UID: \"dcd95282-b63c-48c7-beaa-96e7112a6bd1\") " pod="openshift-must-gather-qhpxg/crc-debug-dbqlz" Feb 19 11:22:50 crc kubenswrapper[4873]: I0219 11:22:50.809449 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcd95282-b63c-48c7-beaa-96e7112a6bd1-host\") pod \"crc-debug-dbqlz\" (UID: \"dcd95282-b63c-48c7-beaa-96e7112a6bd1\") " pod="openshift-must-gather-qhpxg/crc-debug-dbqlz" Feb 19 11:22:50 crc kubenswrapper[4873]: I0219 11:22:50.842175 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8f4j\" (UniqueName: \"kubernetes.io/projected/dcd95282-b63c-48c7-beaa-96e7112a6bd1-kube-api-access-v8f4j\") pod \"crc-debug-dbqlz\" (UID: \"dcd95282-b63c-48c7-beaa-96e7112a6bd1\") " pod="openshift-must-gather-qhpxg/crc-debug-dbqlz" Feb 19 11:22:50 crc kubenswrapper[4873]: I0219 11:22:50.964994 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qhpxg/crc-debug-dbqlz" Feb 19 11:22:51 crc kubenswrapper[4873]: I0219 11:22:51.070192 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qhpxg/crc-debug-dbqlz" event={"ID":"dcd95282-b63c-48c7-beaa-96e7112a6bd1","Type":"ContainerStarted","Data":"dca455c21375b3dc46bc6a6702c107398fe7c7d9aada73a2a3406f6ae430a774"} Feb 19 11:22:52 crc kubenswrapper[4873]: I0219 11:22:52.081254 4873 generic.go:334] "Generic (PLEG): container finished" podID="dcd95282-b63c-48c7-beaa-96e7112a6bd1" containerID="44dd1810b190272782d7c29dbaf6a016c2c0e18db1ac287ddff7ea30dae394ce" exitCode=0 Feb 19 11:22:52 crc kubenswrapper[4873]: I0219 11:22:52.081377 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qhpxg/crc-debug-dbqlz" event={"ID":"dcd95282-b63c-48c7-beaa-96e7112a6bd1","Type":"ContainerDied","Data":"44dd1810b190272782d7c29dbaf6a016c2c0e18db1ac287ddff7ea30dae394ce"} Feb 19 11:22:52 crc kubenswrapper[4873]: I0219 11:22:52.126017 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qhpxg/crc-debug-dbqlz"] Feb 19 11:22:52 crc kubenswrapper[4873]: I0219 11:22:52.135344 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qhpxg/crc-debug-dbqlz"] Feb 19 11:22:52 crc kubenswrapper[4873]: I0219 11:22:52.484168 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:22:52 crc kubenswrapper[4873]: E0219 11:22:52.484470 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:22:53 crc 
kubenswrapper[4873]: I0219 11:22:53.222811 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qhpxg/crc-debug-dbqlz" Feb 19 11:22:53 crc kubenswrapper[4873]: I0219 11:22:53.267075 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8f4j\" (UniqueName: \"kubernetes.io/projected/dcd95282-b63c-48c7-beaa-96e7112a6bd1-kube-api-access-v8f4j\") pod \"dcd95282-b63c-48c7-beaa-96e7112a6bd1\" (UID: \"dcd95282-b63c-48c7-beaa-96e7112a6bd1\") " Feb 19 11:22:53 crc kubenswrapper[4873]: I0219 11:22:53.267259 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcd95282-b63c-48c7-beaa-96e7112a6bd1-host\") pod \"dcd95282-b63c-48c7-beaa-96e7112a6bd1\" (UID: \"dcd95282-b63c-48c7-beaa-96e7112a6bd1\") " Feb 19 11:22:53 crc kubenswrapper[4873]: I0219 11:22:53.267390 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dcd95282-b63c-48c7-beaa-96e7112a6bd1-host" (OuterVolumeSpecName: "host") pod "dcd95282-b63c-48c7-beaa-96e7112a6bd1" (UID: "dcd95282-b63c-48c7-beaa-96e7112a6bd1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 11:22:53 crc kubenswrapper[4873]: I0219 11:22:53.267808 4873 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcd95282-b63c-48c7-beaa-96e7112a6bd1-host\") on node \"crc\" DevicePath \"\"" Feb 19 11:22:53 crc kubenswrapper[4873]: I0219 11:22:53.286231 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd95282-b63c-48c7-beaa-96e7112a6bd1-kube-api-access-v8f4j" (OuterVolumeSpecName: "kube-api-access-v8f4j") pod "dcd95282-b63c-48c7-beaa-96e7112a6bd1" (UID: "dcd95282-b63c-48c7-beaa-96e7112a6bd1"). InnerVolumeSpecName "kube-api-access-v8f4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:22:53 crc kubenswrapper[4873]: I0219 11:22:53.369783 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8f4j\" (UniqueName: \"kubernetes.io/projected/dcd95282-b63c-48c7-beaa-96e7112a6bd1-kube-api-access-v8f4j\") on node \"crc\" DevicePath \"\"" Feb 19 11:22:53 crc kubenswrapper[4873]: I0219 11:22:53.497495 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd95282-b63c-48c7-beaa-96e7112a6bd1" path="/var/lib/kubelet/pods/dcd95282-b63c-48c7-beaa-96e7112a6bd1/volumes" Feb 19 11:22:54 crc kubenswrapper[4873]: I0219 11:22:54.101457 4873 scope.go:117] "RemoveContainer" containerID="44dd1810b190272782d7c29dbaf6a016c2c0e18db1ac287ddff7ea30dae394ce" Feb 19 11:22:54 crc kubenswrapper[4873]: I0219 11:22:54.101513 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qhpxg/crc-debug-dbqlz" Feb 19 11:23:05 crc kubenswrapper[4873]: I0219 11:23:05.484047 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:23:05 crc kubenswrapper[4873]: E0219 11:23:05.484853 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:23:18 crc kubenswrapper[4873]: I0219 11:23:18.484129 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:23:18 crc kubenswrapper[4873]: E0219 11:23:18.484918 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:23:31 crc kubenswrapper[4873]: I0219 11:23:31.492639 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:23:31 crc kubenswrapper[4873]: E0219 11:23:31.493446 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:23:38 crc kubenswrapper[4873]: I0219 11:23:38.272369 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c4d59d6dd-4nh9w_76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3/barbican-api/0.log" Feb 19 11:23:38 crc kubenswrapper[4873]: I0219 11:23:38.427912 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c4d59d6dd-4nh9w_76cdfe98-7182-4ed8-8d4a-7472ed0dc7c3/barbican-api-log/0.log" Feb 19 11:23:38 crc kubenswrapper[4873]: I0219 11:23:38.473396 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-667444df98-tdgw9_9be5e1ee-a214-46ca-a5bf-d1d337848085/barbican-keystone-listener/0.log" Feb 19 11:23:38 crc kubenswrapper[4873]: I0219 11:23:38.556436 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-667444df98-tdgw9_9be5e1ee-a214-46ca-a5bf-d1d337848085/barbican-keystone-listener-log/0.log" Feb 19 11:23:38 crc kubenswrapper[4873]: I0219 
11:23:38.653973 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-596d5556df-fx4q8_fc48b70c-5ab9-4765-a8cd-5985a3f63854/barbican-worker/0.log" Feb 19 11:23:38 crc kubenswrapper[4873]: I0219 11:23:38.988585 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-596d5556df-fx4q8_fc48b70c-5ab9-4765-a8cd-5985a3f63854/barbican-worker-log/0.log" Feb 19 11:23:39 crc kubenswrapper[4873]: I0219 11:23:39.239662 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-j9s9r_fb8aa6eb-a92d-47ab-803f-664399242dde/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:39 crc kubenswrapper[4873]: I0219 11:23:39.318969 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e432fa6f-daf1-4f3a-9f84-ac9495956013/ceilometer-notification-agent/0.log" Feb 19 11:23:39 crc kubenswrapper[4873]: I0219 11:23:39.329908 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e432fa6f-daf1-4f3a-9f84-ac9495956013/ceilometer-central-agent/0.log" Feb 19 11:23:39 crc kubenswrapper[4873]: I0219 11:23:39.478372 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e432fa6f-daf1-4f3a-9f84-ac9495956013/proxy-httpd/0.log" Feb 19 11:23:39 crc kubenswrapper[4873]: I0219 11:23:39.488090 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e432fa6f-daf1-4f3a-9f84-ac9495956013/sg-core/0.log" Feb 19 11:23:39 crc kubenswrapper[4873]: I0219 11:23:39.805385 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f3dabe51-c676-42bb-936a-d784ee2e565a/cinder-api-log/0.log" Feb 19 11:23:40 crc kubenswrapper[4873]: I0219 11:23:40.093566 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_312e766d-4086-4bab-bf8f-9a154f1da5b5/probe/0.log" Feb 19 11:23:40 crc kubenswrapper[4873]: 
I0219 11:23:40.131333 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_312e766d-4086-4bab-bf8f-9a154f1da5b5/cinder-backup/0.log" Feb 19 11:23:40 crc kubenswrapper[4873]: I0219 11:23:40.322905 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f3dabe51-c676-42bb-936a-d784ee2e565a/cinder-api/0.log" Feb 19 11:23:40 crc kubenswrapper[4873]: I0219 11:23:40.327280 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1/cinder-scheduler/0.log" Feb 19 11:23:40 crc kubenswrapper[4873]: I0219 11:23:40.419028 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cd9b32e6-4f78-4f9c-9fbd-e91b37d110a1/probe/0.log" Feb 19 11:23:40 crc kubenswrapper[4873]: I0219 11:23:40.556566 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_717b3122-e7c6-4cbe-8528-4b582dd7adc5/probe/0.log" Feb 19 11:23:40 crc kubenswrapper[4873]: I0219 11:23:40.853318 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_717b3122-e7c6-4cbe-8528-4b582dd7adc5/cinder-volume/0.log" Feb 19 11:23:40 crc kubenswrapper[4873]: I0219 11:23:40.856238 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_8268173a-e7be-4edd-a1e8-bed3486b138e/probe/0.log" Feb 19 11:23:40 crc kubenswrapper[4873]: I0219 11:23:40.901617 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_8268173a-e7be-4edd-a1e8-bed3486b138e/cinder-volume/0.log" Feb 19 11:23:41 crc kubenswrapper[4873]: I0219 11:23:41.090599 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-snp5b_f0739ccd-765a-42c4-89b4-de6adf188e24/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:41 crc kubenswrapper[4873]: I0219 
11:23:41.172825 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kzrd2_40ec1f13-0b91-4c7c-a13e-11e60f55e627/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:41 crc kubenswrapper[4873]: I0219 11:23:41.291412 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c564b89cf-9v87f_20253d93-eafe-45db-b11e-338714ffd978/init/0.log" Feb 19 11:23:41 crc kubenswrapper[4873]: I0219 11:23:41.519114 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c564b89cf-9v87f_20253d93-eafe-45db-b11e-338714ffd978/init/0.log" Feb 19 11:23:41 crc kubenswrapper[4873]: I0219 11:23:41.526260 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-2s7zj_ab7d5a49-ac61-4963-8766-1716098f3d4c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:41 crc kubenswrapper[4873]: I0219 11:23:41.731940 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c564b89cf-9v87f_20253d93-eafe-45db-b11e-338714ffd978/dnsmasq-dns/0.log" Feb 19 11:23:41 crc kubenswrapper[4873]: I0219 11:23:41.823906 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_09cfd898-398f-41ae-8c45-1ed215b69683/glance-httpd/0.log" Feb 19 11:23:41 crc kubenswrapper[4873]: I0219 11:23:41.849948 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_09cfd898-398f-41ae-8c45-1ed215b69683/glance-log/0.log" Feb 19 11:23:42 crc kubenswrapper[4873]: I0219 11:23:42.056501 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c0df7963-e78f-457c-a27f-45c26232cfa7/glance-httpd/0.log" Feb 19 11:23:42 crc kubenswrapper[4873]: I0219 11:23:42.084468 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_c0df7963-e78f-457c-a27f-45c26232cfa7/glance-log/0.log" Feb 19 11:23:42 crc kubenswrapper[4873]: I0219 11:23:42.321171 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6687d9896d-v96j2_fa527f64-6e38-48c2-9927-a319f4579070/horizon/0.log" Feb 19 11:23:42 crc kubenswrapper[4873]: I0219 11:23:42.375259 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-2zqn6_537c2ac8-0912-4609-ab4e-760060a78d52/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:42 crc kubenswrapper[4873]: I0219 11:23:42.605667 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-s2jwj_4b127e45-b09c-4e11-9423-58f1f51effd4/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:43 crc kubenswrapper[4873]: I0219 11:23:43.084742 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6687d9896d-v96j2_fa527f64-6e38-48c2-9927-a319f4579070/horizon-log/0.log" Feb 19 11:23:43 crc kubenswrapper[4873]: I0219 11:23:43.198587 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29524981-pxsmx_3f08f0c4-870d-4d9a-8a82-ce22827ce779/keystone-cron/0.log" Feb 19 11:23:43 crc kubenswrapper[4873]: I0219 11:23:43.224834 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5fcd445c48-xvpw4_ed86f09e-909d-451b-96c0-9b4b7b27eb03/keystone-api/0.log" Feb 19 11:23:43 crc kubenswrapper[4873]: I0219 11:23:43.479219 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_84c63c73-45f3-4d27-a3a3-cbfecd9e1810/kube-state-metrics/0.log" Feb 19 11:23:43 crc kubenswrapper[4873]: I0219 11:23:43.532608 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-z5kkf_2baa296e-fb37-4d90-a7e4-68f61006e085/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:44 crc kubenswrapper[4873]: I0219 11:23:44.019905 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-988bb_a607f592-ebca-4bf5-9e98-f9e2bc131ff1/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:44 crc kubenswrapper[4873]: I0219 11:23:44.038703 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76cc4fb9fc-vdfd4_f168d086-aaa7-4a6e-9a65-5ab28e10a7e8/neutron-httpd/0.log" Feb 19 11:23:44 crc kubenswrapper[4873]: I0219 11:23:44.179310 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76cc4fb9fc-vdfd4_f168d086-aaa7-4a6e-9a65-5ab28e10a7e8/neutron-api/0.log" Feb 19 11:23:44 crc kubenswrapper[4873]: I0219 11:23:44.295715 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_da89f0ff-c51c-4c4a-8df4-f7787d29ddd2/setup-container/0.log" Feb 19 11:23:44 crc kubenswrapper[4873]: I0219 11:23:44.582287 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_da89f0ff-c51c-4c4a-8df4-f7787d29ddd2/setup-container/0.log" Feb 19 11:23:44 crc kubenswrapper[4873]: I0219 11:23:44.621055 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_da89f0ff-c51c-4c4a-8df4-f7787d29ddd2/rabbitmq/0.log" Feb 19 11:23:45 crc kubenswrapper[4873]: I0219 11:23:45.395187 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c25b9f1f-0533-4e00-a926-08639b1b2266/nova-cell0-conductor-conductor/0.log" Feb 19 11:23:45 crc kubenswrapper[4873]: I0219 11:23:45.483893 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 
11:23:45 crc kubenswrapper[4873]: E0219 11:23:45.484268 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:23:45 crc kubenswrapper[4873]: I0219 11:23:45.757196 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0688136a-f0b5-4a2a-8f08-9c99d9c3644c/nova-cell1-conductor-conductor/0.log" Feb 19 11:23:46 crc kubenswrapper[4873]: I0219 11:23:46.108525 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_cf46452a-f49d-48ab-a235-9e96f89c931f/nova-cell1-novncproxy-novncproxy/0.log" Feb 19 11:23:46 crc kubenswrapper[4873]: I0219 11:23:46.428172 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4f4e613e-0a31-4191-9afb-4fd0300586f9/nova-api-log/0.log" Feb 19 11:23:46 crc kubenswrapper[4873]: I0219 11:23:46.445006 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-v25t6_ce5f426d-554a-469a-be1e-e3e1b9bfa68e/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:46 crc kubenswrapper[4873]: I0219 11:23:46.762313 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_15cbab3c-9843-4bf6-b0e8-b65dec1e5112/nova-metadata-log/0.log" Feb 19 11:23:46 crc kubenswrapper[4873]: I0219 11:23:46.930403 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4f4e613e-0a31-4191-9afb-4fd0300586f9/nova-api-api/0.log" Feb 19 11:23:47 crc kubenswrapper[4873]: I0219 11:23:47.419779 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_e3385c22-baa0-4261-b498-6a09c8768520/mysql-bootstrap/0.log" Feb 19 11:23:47 crc kubenswrapper[4873]: I0219 11:23:47.479018 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_adb0395e-00f8-4bc6-a0a6-2b956235c58c/nova-scheduler-scheduler/0.log" Feb 19 11:23:47 crc kubenswrapper[4873]: I0219 11:23:47.683653 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e3385c22-baa0-4261-b498-6a09c8768520/mysql-bootstrap/0.log" Feb 19 11:23:47 crc kubenswrapper[4873]: I0219 11:23:47.755206 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e3385c22-baa0-4261-b498-6a09c8768520/galera/0.log" Feb 19 11:23:47 crc kubenswrapper[4873]: I0219 11:23:47.907852 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964/mysql-bootstrap/0.log" Feb 19 11:23:48 crc kubenswrapper[4873]: I0219 11:23:48.101299 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964/mysql-bootstrap/0.log" Feb 19 11:23:48 crc kubenswrapper[4873]: I0219 11:23:48.123705 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f1b4e4e4-15bf-4c4d-b7c4-bc3029c32964/galera/0.log" Feb 19 11:23:48 crc kubenswrapper[4873]: I0219 11:23:48.380235 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5c4eb2b5-d272-49ff-938e-3e3359d29f46/openstackclient/0.log" Feb 19 11:23:48 crc kubenswrapper[4873]: I0219 11:23:48.449935 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-djxfb_888c3336-cd8a-4bf2-805f-6b473fb272f4/openstack-network-exporter/0.log" Feb 19 11:23:48 crc kubenswrapper[4873]: I0219 11:23:48.599672 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-t5bgp_de2f2331-fc83-420b-9e1b-fe08998cb0ab/ovsdb-server-init/0.log" Feb 19 11:23:48 crc kubenswrapper[4873]: I0219 11:23:48.860954 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t5bgp_de2f2331-fc83-420b-9e1b-fe08998cb0ab/ovsdb-server-init/0.log" Feb 19 11:23:48 crc kubenswrapper[4873]: I0219 11:23:48.870677 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t5bgp_de2f2331-fc83-420b-9e1b-fe08998cb0ab/ovsdb-server/0.log" Feb 19 11:23:49 crc kubenswrapper[4873]: I0219 11:23:49.107912 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vsnt5_b0ab9d21-0c11-4940-ad43-3e20c46012ad/ovn-controller/0.log" Feb 19 11:23:49 crc kubenswrapper[4873]: I0219 11:23:49.283639 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t5bgp_de2f2331-fc83-420b-9e1b-fe08998cb0ab/ovs-vswitchd/0.log" Feb 19 11:23:49 crc kubenswrapper[4873]: I0219 11:23:49.370742 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_15cbab3c-9843-4bf6-b0e8-b65dec1e5112/nova-metadata-metadata/0.log" Feb 19 11:23:49 crc kubenswrapper[4873]: I0219 11:23:49.408317 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-dks5c_f5d576b5-56dd-4f9f-b67b-0ee87213ea78/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:49 crc kubenswrapper[4873]: I0219 11:23:49.540739 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bd6df8e5-8bc5-4bd5-b466-a90642932cc2/openstack-network-exporter/0.log" Feb 19 11:23:49 crc kubenswrapper[4873]: I0219 11:23:49.661981 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bd6df8e5-8bc5-4bd5-b466-a90642932cc2/ovn-northd/0.log" Feb 19 11:23:49 crc kubenswrapper[4873]: I0219 11:23:49.752380 4873 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4574f6e3-d697-424c-a9f1-7b74afb82324/openstack-network-exporter/0.log" Feb 19 11:23:49 crc kubenswrapper[4873]: I0219 11:23:49.910692 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4574f6e3-d697-424c-a9f1-7b74afb82324/ovsdbserver-nb/0.log" Feb 19 11:23:49 crc kubenswrapper[4873]: I0219 11:23:49.910923 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_877efa5f-4357-4396-8805-729237cd4e8f/openstack-network-exporter/0.log" Feb 19 11:23:50 crc kubenswrapper[4873]: I0219 11:23:50.009914 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_877efa5f-4357-4396-8805-729237cd4e8f/ovsdbserver-sb/0.log" Feb 19 11:23:50 crc kubenswrapper[4873]: I0219 11:23:50.362084 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6696d67b98-wrvnm_c5d4dde9-793b-403e-8701-84cca6a509e1/placement-api/0.log" Feb 19 11:23:50 crc kubenswrapper[4873]: I0219 11:23:50.419867 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ae630a8f-ee42-4f96-adb9-d18bf713af37/init-config-reloader/0.log" Feb 19 11:23:50 crc kubenswrapper[4873]: I0219 11:23:50.442945 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6696d67b98-wrvnm_c5d4dde9-793b-403e-8701-84cca6a509e1/placement-log/0.log" Feb 19 11:23:50 crc kubenswrapper[4873]: I0219 11:23:50.643208 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ae630a8f-ee42-4f96-adb9-d18bf713af37/config-reloader/0.log" Feb 19 11:23:50 crc kubenswrapper[4873]: I0219 11:23:50.646215 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ae630a8f-ee42-4f96-adb9-d18bf713af37/init-config-reloader/0.log" Feb 19 11:23:50 crc kubenswrapper[4873]: I0219 11:23:50.648918 4873 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ae630a8f-ee42-4f96-adb9-d18bf713af37/prometheus/0.log" Feb 19 11:23:50 crc kubenswrapper[4873]: I0219 11:23:50.735031 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ae630a8f-ee42-4f96-adb9-d18bf713af37/thanos-sidecar/0.log" Feb 19 11:23:50 crc kubenswrapper[4873]: I0219 11:23:50.870803 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1150426f-909f-4b05-b216-ccf29f7039eb/setup-container/0.log" Feb 19 11:23:51 crc kubenswrapper[4873]: I0219 11:23:51.136706 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1150426f-909f-4b05-b216-ccf29f7039eb/setup-container/0.log" Feb 19 11:23:51 crc kubenswrapper[4873]: I0219 11:23:51.184799 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1150426f-909f-4b05-b216-ccf29f7039eb/rabbitmq/0.log" Feb 19 11:23:51 crc kubenswrapper[4873]: I0219 11:23:51.206817 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d564a6d4-4702-4e96-b814-8d9f01db02e5/setup-container/0.log" Feb 19 11:23:51 crc kubenswrapper[4873]: I0219 11:23:51.394875 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d564a6d4-4702-4e96-b814-8d9f01db02e5/setup-container/0.log" Feb 19 11:23:51 crc kubenswrapper[4873]: I0219 11:23:51.428236 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-nw9vj_157ee933-b692-4c92-bcbd-967bc1cd377c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:51 crc kubenswrapper[4873]: I0219 11:23:51.493727 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d564a6d4-4702-4e96-b814-8d9f01db02e5/rabbitmq/0.log" Feb 19 11:23:51 crc kubenswrapper[4873]: I0219 
11:23:51.554067 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_21bb5d7d-6565-484a-af2d-0edcff2729b3/memcached/0.log" Feb 19 11:23:51 crc kubenswrapper[4873]: I0219 11:23:51.928217 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-mt2n6_3ba1c3b5-6b1a-4d7e-bbdd-fb492abd6647/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:51 crc kubenswrapper[4873]: I0219 11:23:51.954372 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-mxmpn_fda37ba3-82f5-4d49-a15f-4dca53649ec7/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.098528 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-5wvjf_7843f72c-5559-44d6-86e0-62f013e0a073/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.164206 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-sw8hj_15999617-f2b4-4a3f-911d-422db799fa37/ssh-known-hosts-edpm-deployment/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.386090 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7c6d694569-qbpxm_d51beb70-e455-4e75-9e06-863b41fbf9a8/proxy-server/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.419769 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7c6d694569-qbpxm_d51beb70-e455-4e75-9e06-863b41fbf9a8/proxy-httpd/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.449170 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mx6qq_91fbca18-847d-4e7b-8a40-e52dd348d155/swift-ring-rebalance/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.617617 4873 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/account-reaper/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.643515 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/account-auditor/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.646506 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/account-replicator/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.665011 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/account-server/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.687545 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/container-auditor/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.901859 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/object-auditor/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.904266 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/container-server/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.907658 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/container-updater/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.912521 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/container-replicator/0.log" Feb 19 11:23:52 crc kubenswrapper[4873]: I0219 11:23:52.962844 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/object-expirer/0.log" Feb 19 11:23:53 crc kubenswrapper[4873]: I0219 11:23:53.129535 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/object-server/0.log" Feb 19 11:23:53 crc kubenswrapper[4873]: I0219 11:23:53.133645 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/rsync/0.log" Feb 19 11:23:53 crc kubenswrapper[4873]: I0219 11:23:53.138196 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/object-updater/0.log" Feb 19 11:23:53 crc kubenswrapper[4873]: I0219 11:23:53.156308 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/object-replicator/0.log" Feb 19 11:23:53 crc kubenswrapper[4873]: I0219 11:23:53.185715 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c3b21a02-7162-42ca-84cf-e0fa36b04a22/swift-recon-cron/0.log" Feb 19 11:23:53 crc kubenswrapper[4873]: I0219 11:23:53.358807 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4bwqz_bf143721-2963-4009-8e23-0c283b4a88a3/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:53 crc kubenswrapper[4873]: I0219 11:23:53.388460 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5e5a79da-a068-4a68-ba79-6719ea0fb353/tempest-tests-tempest-tests-runner/0.log" Feb 19 11:23:53 crc kubenswrapper[4873]: I0219 11:23:53.526127 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_58738a83-0734-4889-9b0e-650e43f6dbb7/test-operator-logs-container/0.log" Feb 19 11:23:53 crc kubenswrapper[4873]: I0219 
11:23:53.629863 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-2nnwh_28f40398-582f-40ed-92b8-2ff5a19d138d/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 11:23:54 crc kubenswrapper[4873]: I0219 11:23:54.319151 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_3d0e231c-7848-4f57-a28b-dfec3c87b617/watcher-applier/0.log" Feb 19 11:23:55 crc kubenswrapper[4873]: I0219 11:23:55.034593 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_9fb835f9-7ac4-4212-a372-b793c2fb8afd/watcher-api-log/0.log" Feb 19 11:23:56 crc kubenswrapper[4873]: I0219 11:23:56.483991 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:23:56 crc kubenswrapper[4873]: E0219 11:23:56.484302 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:23:57 crc kubenswrapper[4873]: I0219 11:23:57.415626 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_3ecf8671-28f5-4549-a4c1-0cdad8421837/watcher-decision-engine/0.log" Feb 19 11:23:58 crc kubenswrapper[4873]: I0219 11:23:58.343450 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_9fb835f9-7ac4-4212-a372-b793c2fb8afd/watcher-api/0.log" Feb 19 11:24:09 crc kubenswrapper[4873]: I0219 11:24:09.488462 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:24:09 crc 
kubenswrapper[4873]: E0219 11:24:09.489199 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:24:21 crc kubenswrapper[4873]: I0219 11:24:21.490985 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:24:21 crc kubenswrapper[4873]: E0219 11:24:21.491857 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:24:22 crc kubenswrapper[4873]: I0219 11:24:22.884620 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6_78582e6c-dedc-4608-a542-6837184954ab/util/0.log" Feb 19 11:24:23 crc kubenswrapper[4873]: I0219 11:24:23.060701 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6_78582e6c-dedc-4608-a542-6837184954ab/util/0.log" Feb 19 11:24:23 crc kubenswrapper[4873]: I0219 11:24:23.075071 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6_78582e6c-dedc-4608-a542-6837184954ab/pull/0.log" Feb 19 11:24:23 crc kubenswrapper[4873]: I0219 11:24:23.082497 4873 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6_78582e6c-dedc-4608-a542-6837184954ab/pull/0.log" Feb 19 11:24:23 crc kubenswrapper[4873]: I0219 11:24:23.340745 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6_78582e6c-dedc-4608-a542-6837184954ab/pull/0.log" Feb 19 11:24:23 crc kubenswrapper[4873]: I0219 11:24:23.363305 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6_78582e6c-dedc-4608-a542-6837184954ab/extract/0.log" Feb 19 11:24:23 crc kubenswrapper[4873]: I0219 11:24:23.395621 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8e5457509d36cb248fb3a6842d85c029cfff46a5712e7fc5aa077e58cemtmp6_78582e6c-dedc-4608-a542-6837184954ab/util/0.log" Feb 19 11:24:24 crc kubenswrapper[4873]: I0219 11:24:24.161136 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-t54x9_f108f6ea-4506-48bf-b948-e367078c3dce/manager/0.log" Feb 19 11:24:24 crc kubenswrapper[4873]: I0219 11:24:24.553036 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-vgxsl_43531003-74d3-43b9-b0f5-6fca42b21975/manager/0.log" Feb 19 11:24:24 crc kubenswrapper[4873]: I0219 11:24:24.624834 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-vwx5n_8d4b6c84-e5ed-4761-b7c7-95b21da856f7/manager/0.log" Feb 19 11:24:24 crc kubenswrapper[4873]: I0219 11:24:24.892554 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-r9b5b_2b1c8872-b310-4994-819c-a8e472d8e522/manager/0.log" Feb 19 11:24:25 
crc kubenswrapper[4873]: I0219 11:24:25.521428 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-f86jr_aeccf47e-b953-4036-b271-be284b9ab385/manager/0.log" Feb 19 11:24:25 crc kubenswrapper[4873]: I0219 11:24:25.797318 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-4t46s_3ff0155f-08fd-42f5-9b31-c3b9a7cefefe/manager/0.log" Feb 19 11:24:26 crc kubenswrapper[4873]: I0219 11:24:26.054228 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-t7mwr_ecf3484a-026e-4655-bfa8-e5292e2f62c5/manager/0.log" Feb 19 11:24:26 crc kubenswrapper[4873]: I0219 11:24:26.284972 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-t2hfl_e4172fa9-b04e-4894-82d6-ec65ea92b004/manager/0.log" Feb 19 11:24:26 crc kubenswrapper[4873]: I0219 11:24:26.513217 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-8v7q6_588098b3-662f-4f6f-914c-8cb28e055ccd/manager/0.log" Feb 19 11:24:26 crc kubenswrapper[4873]: I0219 11:24:26.803686 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-cx7xf_2e7ca3f2-f73b-4bac-93bb-68b2518d956e/manager/0.log" Feb 19 11:24:26 crc kubenswrapper[4873]: I0219 11:24:26.832255 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-d6h72_c471d099-fa02-4463-9eb9-9d0f6a3832e6/manager/0.log" Feb 19 11:24:27 crc kubenswrapper[4873]: I0219 11:24:27.180658 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-n6djt_8eec8859-f388-4d81-bbce-0433a66a1ef7/manager/0.log" Feb 19 
11:24:27 crc kubenswrapper[4873]: I0219 11:24:27.425328 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cr64hv_515c6c0c-ae00-4ae1-ab3f-e22e5a585681/manager/0.log" Feb 19 11:24:27 crc kubenswrapper[4873]: I0219 11:24:27.669467 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-8476bb6847-rv4sx_e18b6851-e022-488e-bd95-27d1659f2761/operator/0.log" Feb 19 11:24:27 crc kubenswrapper[4873]: I0219 11:24:27.931778 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-p62rb_0144fe1c-ef13-4b4e-8cda-ddc72e2516bb/registry-server/0.log" Feb 19 11:24:28 crc kubenswrapper[4873]: I0219 11:24:28.244157 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-db4dr_dc53742c-7e71-49fa-9378-b26036c80275/manager/0.log" Feb 19 11:24:28 crc kubenswrapper[4873]: I0219 11:24:28.487672 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-6hpwv_74e9952e-50ef-4389-aa77-8f6e9cc790a8/manager/0.log" Feb 19 11:24:28 crc kubenswrapper[4873]: I0219 11:24:28.708705 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-lcnz4_9574bff7-0aac-4a24-b69f-135ff968422e/operator/0.log" Feb 19 11:24:28 crc kubenswrapper[4873]: I0219 11:24:28.963892 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-r74rt_1f098ace-bbc4-46ee-8e72-ab65a59851eb/manager/0.log" Feb 19 11:24:29 crc kubenswrapper[4873]: I0219 11:24:29.521021 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-2szzj_e139553a-a68d-424d-95b5-9093ea05440b/manager/0.log" Feb 
19 11:24:29 crc kubenswrapper[4873]: I0219 11:24:29.546235 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-g22tc_0e9da99c-56ee-4353-9378-c59a2c4e1608/manager/0.log" Feb 19 11:24:29 crc kubenswrapper[4873]: I0219 11:24:29.964443 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7d767c64df-hld6w_e827e28d-ffd8-4f59-82bf-a6db1dab5413/manager/0.log" Feb 19 11:24:30 crc kubenswrapper[4873]: I0219 11:24:30.136337 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-77c7c45f98-q8khx_26f0a6ea-18fb-411a-b193-83938a4bbe19/manager/0.log" Feb 19 11:24:30 crc kubenswrapper[4873]: I0219 11:24:30.200958 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-t9kgf_080befba-c501-4f84-8644-6b9fda0d8d5f/manager/0.log" Feb 19 11:24:33 crc kubenswrapper[4873]: I0219 11:24:33.483708 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:24:33 crc kubenswrapper[4873]: E0219 11:24:33.484552 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:24:36 crc kubenswrapper[4873]: I0219 11:24:36.353138 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-hqmvw_d53d2bae-fcdd-408c-9950-440e841cc035/manager/0.log" Feb 19 11:24:45 crc kubenswrapper[4873]: I0219 11:24:45.484973 
4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:24:45 crc kubenswrapper[4873]: E0219 11:24:45.485823 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:24:50 crc kubenswrapper[4873]: I0219 11:24:50.926399 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-s67xb_d639ff25-343e-4e7c-bd2e-f5fc533923f4/control-plane-machine-set-operator/0.log" Feb 19 11:24:51 crc kubenswrapper[4873]: I0219 11:24:51.435720 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-k627b_df659e7d-39ab-41ee-8df5-08896976666c/machine-api-operator/0.log" Feb 19 11:24:51 crc kubenswrapper[4873]: I0219 11:24:51.463247 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-k627b_df659e7d-39ab-41ee-8df5-08896976666c/kube-rbac-proxy/0.log" Feb 19 11:24:58 crc kubenswrapper[4873]: I0219 11:24:58.484051 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:24:58 crc kubenswrapper[4873]: E0219 11:24:58.484851 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:25:04 crc kubenswrapper[4873]: I0219 11:25:04.702716 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-ckd42_51fc361b-11a5-480a-a5b9-0eb4b7670e83/cert-manager-controller/0.log" Feb 19 11:25:04 crc kubenswrapper[4873]: I0219 11:25:04.883812 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-zhqgv_084c90b4-3270-4f64-8c8c-1a96f05dc1fa/cert-manager-cainjector/0.log" Feb 19 11:25:04 crc kubenswrapper[4873]: I0219 11:25:04.891130 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-fhd9c_2eebe311-368b-45b4-9e74-7442221e3785/cert-manager-webhook/0.log" Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.524292 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vh782"] Feb 19 11:25:07 crc kubenswrapper[4873]: E0219 11:25:07.525137 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd95282-b63c-48c7-beaa-96e7112a6bd1" containerName="container-00" Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.525153 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd95282-b63c-48c7-beaa-96e7112a6bd1" containerName="container-00" Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.525396 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd95282-b63c-48c7-beaa-96e7112a6bd1" containerName="container-00" Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.527073 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.549944 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vh782"] Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.603949 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdqsh\" (UniqueName: \"kubernetes.io/projected/2f11d216-e951-49a8-9728-9348dd3e09ab-kube-api-access-wdqsh\") pod \"redhat-operators-vh782\" (UID: \"2f11d216-e951-49a8-9728-9348dd3e09ab\") " pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.604058 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f11d216-e951-49a8-9728-9348dd3e09ab-catalog-content\") pod \"redhat-operators-vh782\" (UID: \"2f11d216-e951-49a8-9728-9348dd3e09ab\") " pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.604234 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f11d216-e951-49a8-9728-9348dd3e09ab-utilities\") pod \"redhat-operators-vh782\" (UID: \"2f11d216-e951-49a8-9728-9348dd3e09ab\") " pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.706471 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f11d216-e951-49a8-9728-9348dd3e09ab-utilities\") pod \"redhat-operators-vh782\" (UID: \"2f11d216-e951-49a8-9728-9348dd3e09ab\") " pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.706570 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wdqsh\" (UniqueName: \"kubernetes.io/projected/2f11d216-e951-49a8-9728-9348dd3e09ab-kube-api-access-wdqsh\") pod \"redhat-operators-vh782\" (UID: \"2f11d216-e951-49a8-9728-9348dd3e09ab\") " pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.706634 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f11d216-e951-49a8-9728-9348dd3e09ab-catalog-content\") pod \"redhat-operators-vh782\" (UID: \"2f11d216-e951-49a8-9728-9348dd3e09ab\") " pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.707067 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f11d216-e951-49a8-9728-9348dd3e09ab-catalog-content\") pod \"redhat-operators-vh782\" (UID: \"2f11d216-e951-49a8-9728-9348dd3e09ab\") " pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.707191 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f11d216-e951-49a8-9728-9348dd3e09ab-utilities\") pod \"redhat-operators-vh782\" (UID: \"2f11d216-e951-49a8-9728-9348dd3e09ab\") " pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.724548 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdqsh\" (UniqueName: \"kubernetes.io/projected/2f11d216-e951-49a8-9728-9348dd3e09ab-kube-api-access-wdqsh\") pod \"redhat-operators-vh782\" (UID: \"2f11d216-e951-49a8-9728-9348dd3e09ab\") " pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:07 crc kubenswrapper[4873]: I0219 11:25:07.862049 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:08 crc kubenswrapper[4873]: I0219 11:25:08.358992 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vh782"] Feb 19 11:25:08 crc kubenswrapper[4873]: I0219 11:25:08.447780 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh782" event={"ID":"2f11d216-e951-49a8-9728-9348dd3e09ab","Type":"ContainerStarted","Data":"b05d7372ec51e0442e4648a2dc4cd081b94c1b9c9a02e0ea0e267bcfe4e90abb"} Feb 19 11:25:09 crc kubenswrapper[4873]: I0219 11:25:09.459987 4873 generic.go:334] "Generic (PLEG): container finished" podID="2f11d216-e951-49a8-9728-9348dd3e09ab" containerID="edb548abbd6f9dd1cc88a622635559640a64f2ec1f91e50fae7650e337dd57f4" exitCode=0 Feb 19 11:25:09 crc kubenswrapper[4873]: I0219 11:25:09.460029 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh782" event={"ID":"2f11d216-e951-49a8-9728-9348dd3e09ab","Type":"ContainerDied","Data":"edb548abbd6f9dd1cc88a622635559640a64f2ec1f91e50fae7650e337dd57f4"} Feb 19 11:25:10 crc kubenswrapper[4873]: I0219 11:25:10.922184 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sxbs9"] Feb 19 11:25:10 crc kubenswrapper[4873]: I0219 11:25:10.924496 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:10 crc kubenswrapper[4873]: I0219 11:25:10.944596 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxbs9"] Feb 19 11:25:11 crc kubenswrapper[4873]: I0219 11:25:11.097745 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9p5p\" (UniqueName: \"kubernetes.io/projected/e39282f2-483a-457f-9a81-ed6faf0794a2-kube-api-access-r9p5p\") pod \"redhat-marketplace-sxbs9\" (UID: \"e39282f2-483a-457f-9a81-ed6faf0794a2\") " pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:11 crc kubenswrapper[4873]: I0219 11:25:11.098172 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e39282f2-483a-457f-9a81-ed6faf0794a2-utilities\") pod \"redhat-marketplace-sxbs9\" (UID: \"e39282f2-483a-457f-9a81-ed6faf0794a2\") " pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:11 crc kubenswrapper[4873]: I0219 11:25:11.098195 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e39282f2-483a-457f-9a81-ed6faf0794a2-catalog-content\") pod \"redhat-marketplace-sxbs9\" (UID: \"e39282f2-483a-457f-9a81-ed6faf0794a2\") " pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:11 crc kubenswrapper[4873]: I0219 11:25:11.199471 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9p5p\" (UniqueName: \"kubernetes.io/projected/e39282f2-483a-457f-9a81-ed6faf0794a2-kube-api-access-r9p5p\") pod \"redhat-marketplace-sxbs9\" (UID: \"e39282f2-483a-457f-9a81-ed6faf0794a2\") " pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:11 crc kubenswrapper[4873]: I0219 11:25:11.199675 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e39282f2-483a-457f-9a81-ed6faf0794a2-utilities\") pod \"redhat-marketplace-sxbs9\" (UID: \"e39282f2-483a-457f-9a81-ed6faf0794a2\") " pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:11 crc kubenswrapper[4873]: I0219 11:25:11.199696 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e39282f2-483a-457f-9a81-ed6faf0794a2-catalog-content\") pod \"redhat-marketplace-sxbs9\" (UID: \"e39282f2-483a-457f-9a81-ed6faf0794a2\") " pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:11 crc kubenswrapper[4873]: I0219 11:25:11.200235 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e39282f2-483a-457f-9a81-ed6faf0794a2-catalog-content\") pod \"redhat-marketplace-sxbs9\" (UID: \"e39282f2-483a-457f-9a81-ed6faf0794a2\") " pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:11 crc kubenswrapper[4873]: I0219 11:25:11.200687 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e39282f2-483a-457f-9a81-ed6faf0794a2-utilities\") pod \"redhat-marketplace-sxbs9\" (UID: \"e39282f2-483a-457f-9a81-ed6faf0794a2\") " pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:11 crc kubenswrapper[4873]: I0219 11:25:11.223128 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9p5p\" (UniqueName: \"kubernetes.io/projected/e39282f2-483a-457f-9a81-ed6faf0794a2-kube-api-access-r9p5p\") pod \"redhat-marketplace-sxbs9\" (UID: \"e39282f2-483a-457f-9a81-ed6faf0794a2\") " pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:11 crc kubenswrapper[4873]: I0219 11:25:11.247294 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:11 crc kubenswrapper[4873]: I0219 11:25:11.516085 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh782" event={"ID":"2f11d216-e951-49a8-9728-9348dd3e09ab","Type":"ContainerStarted","Data":"4b5582085ad38131d740eddcd5c773d88f2d945d5cf67562d6dc6da86ee55b55"} Feb 19 11:25:11 crc kubenswrapper[4873]: I0219 11:25:11.799359 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxbs9"] Feb 19 11:25:12 crc kubenswrapper[4873]: I0219 11:25:12.484266 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:25:12 crc kubenswrapper[4873]: E0219 11:25:12.484944 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:25:12 crc kubenswrapper[4873]: I0219 11:25:12.513661 4873 generic.go:334] "Generic (PLEG): container finished" podID="e39282f2-483a-457f-9a81-ed6faf0794a2" containerID="5f164c85151d9a6b6ad768d870ce77d344a5947bf1dcc7d03e716b30ff844993" exitCode=0 Feb 19 11:25:12 crc kubenswrapper[4873]: I0219 11:25:12.513711 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxbs9" event={"ID":"e39282f2-483a-457f-9a81-ed6faf0794a2","Type":"ContainerDied","Data":"5f164c85151d9a6b6ad768d870ce77d344a5947bf1dcc7d03e716b30ff844993"} Feb 19 11:25:12 crc kubenswrapper[4873]: I0219 11:25:12.513758 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxbs9" 
event={"ID":"e39282f2-483a-457f-9a81-ed6faf0794a2","Type":"ContainerStarted","Data":"6d80a83b03b357f5f4bbd1bca810d6a76cb6aaf294b677f996036db6833706e1"} Feb 19 11:25:14 crc kubenswrapper[4873]: I0219 11:25:14.534709 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxbs9" event={"ID":"e39282f2-483a-457f-9a81-ed6faf0794a2","Type":"ContainerStarted","Data":"c6651c8df75a6ece9b8f0a0290c6230600543e4bbf34a68e0e3e25ac6e16e8af"} Feb 19 11:25:16 crc kubenswrapper[4873]: I0219 11:25:16.560928 4873 generic.go:334] "Generic (PLEG): container finished" podID="e39282f2-483a-457f-9a81-ed6faf0794a2" containerID="c6651c8df75a6ece9b8f0a0290c6230600543e4bbf34a68e0e3e25ac6e16e8af" exitCode=0 Feb 19 11:25:16 crc kubenswrapper[4873]: I0219 11:25:16.560991 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxbs9" event={"ID":"e39282f2-483a-457f-9a81-ed6faf0794a2","Type":"ContainerDied","Data":"c6651c8df75a6ece9b8f0a0290c6230600543e4bbf34a68e0e3e25ac6e16e8af"} Feb 19 11:25:17 crc kubenswrapper[4873]: I0219 11:25:17.573720 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxbs9" event={"ID":"e39282f2-483a-457f-9a81-ed6faf0794a2","Type":"ContainerStarted","Data":"a622611ca13929d3f89f9fe1cf634a4f477dfef54420733eea69ed9789d819da"} Feb 19 11:25:17 crc kubenswrapper[4873]: I0219 11:25:17.600117 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sxbs9" podStartSLOduration=3.148563052 podStartE2EDuration="7.600078902s" podCreationTimestamp="2026-02-19 11:25:10 +0000 UTC" firstStartedPulling="2026-02-19 11:25:12.515746077 +0000 UTC m=+6021.805177735" lastFinishedPulling="2026-02-19 11:25:16.967261947 +0000 UTC m=+6026.256693585" observedRunningTime="2026-02-19 11:25:17.59763577 +0000 UTC m=+6026.887067408" watchObservedRunningTime="2026-02-19 11:25:17.600078902 +0000 UTC 
m=+6026.889510540" Feb 19 11:25:18 crc kubenswrapper[4873]: I0219 11:25:18.600230 4873 generic.go:334] "Generic (PLEG): container finished" podID="2f11d216-e951-49a8-9728-9348dd3e09ab" containerID="4b5582085ad38131d740eddcd5c773d88f2d945d5cf67562d6dc6da86ee55b55" exitCode=0 Feb 19 11:25:18 crc kubenswrapper[4873]: I0219 11:25:18.600280 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh782" event={"ID":"2f11d216-e951-49a8-9728-9348dd3e09ab","Type":"ContainerDied","Data":"4b5582085ad38131d740eddcd5c773d88f2d945d5cf67562d6dc6da86ee55b55"} Feb 19 11:25:19 crc kubenswrapper[4873]: I0219 11:25:19.565933 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-9cr2m_9b3c6348-1c17-4774-9739-7a1dd3021d81/nmstate-console-plugin/0.log" Feb 19 11:25:19 crc kubenswrapper[4873]: I0219 11:25:19.615435 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh782" event={"ID":"2f11d216-e951-49a8-9728-9348dd3e09ab","Type":"ContainerStarted","Data":"5f8392fe03ecbd700fd918ae9af7167011d155c907cde8e0623f08621d930819"} Feb 19 11:25:19 crc kubenswrapper[4873]: I0219 11:25:19.640185 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vh782" podStartSLOduration=3.088485794 podStartE2EDuration="12.640164627s" podCreationTimestamp="2026-02-19 11:25:07 +0000 UTC" firstStartedPulling="2026-02-19 11:25:09.461931018 +0000 UTC m=+6018.751362656" lastFinishedPulling="2026-02-19 11:25:19.013609851 +0000 UTC m=+6028.303041489" observedRunningTime="2026-02-19 11:25:19.635485489 +0000 UTC m=+6028.924917127" watchObservedRunningTime="2026-02-19 11:25:19.640164627 +0000 UTC m=+6028.929596265" Feb 19 11:25:19 crc kubenswrapper[4873]: I0219 11:25:19.856371 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-75txf_62408ce4-73ce-4726-91c1-96f645c39dee/nmstate-handler/0.log" Feb 19 11:25:19 crc kubenswrapper[4873]: I0219 11:25:19.974278 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-8jgss_3b960434-ef37-45ae-aa50-8d719c8e2df5/kube-rbac-proxy/0.log" Feb 19 11:25:20 crc kubenswrapper[4873]: I0219 11:25:20.146540 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-8jgss_3b960434-ef37-45ae-aa50-8d719c8e2df5/nmstate-metrics/0.log" Feb 19 11:25:20 crc kubenswrapper[4873]: I0219 11:25:20.183511 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-qlgxw_f7f28c8a-4571-485c-96a2-fc1c5856e3ea/nmstate-operator/0.log" Feb 19 11:25:20 crc kubenswrapper[4873]: I0219 11:25:20.371235 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-nfh8w_7af074a2-c1f7-4253-8efc-065748e0452b/nmstate-webhook/0.log" Feb 19 11:25:21 crc kubenswrapper[4873]: I0219 11:25:21.248137 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:21 crc kubenswrapper[4873]: I0219 11:25:21.248466 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:22 crc kubenswrapper[4873]: I0219 11:25:22.307847 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-sxbs9" podUID="e39282f2-483a-457f-9a81-ed6faf0794a2" containerName="registry-server" probeResult="failure" output=< Feb 19 11:25:22 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s Feb 19 11:25:22 crc kubenswrapper[4873]: > Feb 19 11:25:25 crc kubenswrapper[4873]: I0219 11:25:25.485228 4873 scope.go:117] "RemoveContainer" 
containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:25:26 crc kubenswrapper[4873]: I0219 11:25:26.690920 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"94efb2b7f91f96b952ade76f54d8cf096f4da7e422f2f8758aaf2ca9208fbda3"} Feb 19 11:25:27 crc kubenswrapper[4873]: I0219 11:25:27.862263 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:27 crc kubenswrapper[4873]: I0219 11:25:27.862621 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:28 crc kubenswrapper[4873]: I0219 11:25:28.906662 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vh782" podUID="2f11d216-e951-49a8-9728-9348dd3e09ab" containerName="registry-server" probeResult="failure" output=< Feb 19 11:25:28 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s Feb 19 11:25:28 crc kubenswrapper[4873]: > Feb 19 11:25:31 crc kubenswrapper[4873]: I0219 11:25:31.298339 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:31 crc kubenswrapper[4873]: I0219 11:25:31.352506 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:31 crc kubenswrapper[4873]: I0219 11:25:31.539765 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxbs9"] Feb 19 11:25:32 crc kubenswrapper[4873]: I0219 11:25:32.742890 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sxbs9" podUID="e39282f2-483a-457f-9a81-ed6faf0794a2" 
containerName="registry-server" containerID="cri-o://a622611ca13929d3f89f9fe1cf634a4f477dfef54420733eea69ed9789d819da" gracePeriod=2 Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.297252 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.401225 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9p5p\" (UniqueName: \"kubernetes.io/projected/e39282f2-483a-457f-9a81-ed6faf0794a2-kube-api-access-r9p5p\") pod \"e39282f2-483a-457f-9a81-ed6faf0794a2\" (UID: \"e39282f2-483a-457f-9a81-ed6faf0794a2\") " Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.401551 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e39282f2-483a-457f-9a81-ed6faf0794a2-catalog-content\") pod \"e39282f2-483a-457f-9a81-ed6faf0794a2\" (UID: \"e39282f2-483a-457f-9a81-ed6faf0794a2\") " Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.401645 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e39282f2-483a-457f-9a81-ed6faf0794a2-utilities\") pod \"e39282f2-483a-457f-9a81-ed6faf0794a2\" (UID: \"e39282f2-483a-457f-9a81-ed6faf0794a2\") " Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.402369 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e39282f2-483a-457f-9a81-ed6faf0794a2-utilities" (OuterVolumeSpecName: "utilities") pod "e39282f2-483a-457f-9a81-ed6faf0794a2" (UID: "e39282f2-483a-457f-9a81-ed6faf0794a2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.407950 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e39282f2-483a-457f-9a81-ed6faf0794a2-kube-api-access-r9p5p" (OuterVolumeSpecName: "kube-api-access-r9p5p") pod "e39282f2-483a-457f-9a81-ed6faf0794a2" (UID: "e39282f2-483a-457f-9a81-ed6faf0794a2"). InnerVolumeSpecName "kube-api-access-r9p5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.451261 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e39282f2-483a-457f-9a81-ed6faf0794a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e39282f2-483a-457f-9a81-ed6faf0794a2" (UID: "e39282f2-483a-457f-9a81-ed6faf0794a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.503746 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e39282f2-483a-457f-9a81-ed6faf0794a2-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.503792 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9p5p\" (UniqueName: \"kubernetes.io/projected/e39282f2-483a-457f-9a81-ed6faf0794a2-kube-api-access-r9p5p\") on node \"crc\" DevicePath \"\"" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.503806 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e39282f2-483a-457f-9a81-ed6faf0794a2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.756528 4873 generic.go:334] "Generic (PLEG): container finished" podID="e39282f2-483a-457f-9a81-ed6faf0794a2" 
containerID="a622611ca13929d3f89f9fe1cf634a4f477dfef54420733eea69ed9789d819da" exitCode=0 Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.756629 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxbs9" event={"ID":"e39282f2-483a-457f-9a81-ed6faf0794a2","Type":"ContainerDied","Data":"a622611ca13929d3f89f9fe1cf634a4f477dfef54420733eea69ed9789d819da"} Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.756923 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxbs9" event={"ID":"e39282f2-483a-457f-9a81-ed6faf0794a2","Type":"ContainerDied","Data":"6d80a83b03b357f5f4bbd1bca810d6a76cb6aaf294b677f996036db6833706e1"} Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.756656 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxbs9" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.756959 4873 scope.go:117] "RemoveContainer" containerID="a622611ca13929d3f89f9fe1cf634a4f477dfef54420733eea69ed9789d819da" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.797640 4873 scope.go:117] "RemoveContainer" containerID="c6651c8df75a6ece9b8f0a0290c6230600543e4bbf34a68e0e3e25ac6e16e8af" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.798361 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxbs9"] Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.812701 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxbs9"] Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.817181 4873 scope.go:117] "RemoveContainer" containerID="5f164c85151d9a6b6ad768d870ce77d344a5947bf1dcc7d03e716b30ff844993" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.873775 4873 scope.go:117] "RemoveContainer" containerID="a622611ca13929d3f89f9fe1cf634a4f477dfef54420733eea69ed9789d819da" Feb 19 
11:25:33 crc kubenswrapper[4873]: E0219 11:25:33.874290 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a622611ca13929d3f89f9fe1cf634a4f477dfef54420733eea69ed9789d819da\": container with ID starting with a622611ca13929d3f89f9fe1cf634a4f477dfef54420733eea69ed9789d819da not found: ID does not exist" containerID="a622611ca13929d3f89f9fe1cf634a4f477dfef54420733eea69ed9789d819da" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.874356 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a622611ca13929d3f89f9fe1cf634a4f477dfef54420733eea69ed9789d819da"} err="failed to get container status \"a622611ca13929d3f89f9fe1cf634a4f477dfef54420733eea69ed9789d819da\": rpc error: code = NotFound desc = could not find container \"a622611ca13929d3f89f9fe1cf634a4f477dfef54420733eea69ed9789d819da\": container with ID starting with a622611ca13929d3f89f9fe1cf634a4f477dfef54420733eea69ed9789d819da not found: ID does not exist" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.874384 4873 scope.go:117] "RemoveContainer" containerID="c6651c8df75a6ece9b8f0a0290c6230600543e4bbf34a68e0e3e25ac6e16e8af" Feb 19 11:25:33 crc kubenswrapper[4873]: E0219 11:25:33.874731 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6651c8df75a6ece9b8f0a0290c6230600543e4bbf34a68e0e3e25ac6e16e8af\": container with ID starting with c6651c8df75a6ece9b8f0a0290c6230600543e4bbf34a68e0e3e25ac6e16e8af not found: ID does not exist" containerID="c6651c8df75a6ece9b8f0a0290c6230600543e4bbf34a68e0e3e25ac6e16e8af" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.874772 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6651c8df75a6ece9b8f0a0290c6230600543e4bbf34a68e0e3e25ac6e16e8af"} err="failed to get container status 
\"c6651c8df75a6ece9b8f0a0290c6230600543e4bbf34a68e0e3e25ac6e16e8af\": rpc error: code = NotFound desc = could not find container \"c6651c8df75a6ece9b8f0a0290c6230600543e4bbf34a68e0e3e25ac6e16e8af\": container with ID starting with c6651c8df75a6ece9b8f0a0290c6230600543e4bbf34a68e0e3e25ac6e16e8af not found: ID does not exist" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.874803 4873 scope.go:117] "RemoveContainer" containerID="5f164c85151d9a6b6ad768d870ce77d344a5947bf1dcc7d03e716b30ff844993" Feb 19 11:25:33 crc kubenswrapper[4873]: E0219 11:25:33.875211 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f164c85151d9a6b6ad768d870ce77d344a5947bf1dcc7d03e716b30ff844993\": container with ID starting with 5f164c85151d9a6b6ad768d870ce77d344a5947bf1dcc7d03e716b30ff844993 not found: ID does not exist" containerID="5f164c85151d9a6b6ad768d870ce77d344a5947bf1dcc7d03e716b30ff844993" Feb 19 11:25:33 crc kubenswrapper[4873]: I0219 11:25:33.875255 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f164c85151d9a6b6ad768d870ce77d344a5947bf1dcc7d03e716b30ff844993"} err="failed to get container status \"5f164c85151d9a6b6ad768d870ce77d344a5947bf1dcc7d03e716b30ff844993\": rpc error: code = NotFound desc = could not find container \"5f164c85151d9a6b6ad768d870ce77d344a5947bf1dcc7d03e716b30ff844993\": container with ID starting with 5f164c85151d9a6b6ad768d870ce77d344a5947bf1dcc7d03e716b30ff844993 not found: ID does not exist" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.345759 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vfjzm"] Feb 19 11:25:34 crc kubenswrapper[4873]: E0219 11:25:34.346222 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e39282f2-483a-457f-9a81-ed6faf0794a2" containerName="registry-server" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 
11:25:34.346243 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e39282f2-483a-457f-9a81-ed6faf0794a2" containerName="registry-server" Feb 19 11:25:34 crc kubenswrapper[4873]: E0219 11:25:34.346287 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e39282f2-483a-457f-9a81-ed6faf0794a2" containerName="extract-utilities" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.346296 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e39282f2-483a-457f-9a81-ed6faf0794a2" containerName="extract-utilities" Feb 19 11:25:34 crc kubenswrapper[4873]: E0219 11:25:34.346321 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e39282f2-483a-457f-9a81-ed6faf0794a2" containerName="extract-content" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.346329 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="e39282f2-483a-457f-9a81-ed6faf0794a2" containerName="extract-content" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.346560 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="e39282f2-483a-457f-9a81-ed6faf0794a2" containerName="registry-server" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.348462 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.393696 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vfjzm"] Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.419225 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-catalog-content\") pod \"community-operators-vfjzm\" (UID: \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\") " pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.419297 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6nkc\" (UniqueName: \"kubernetes.io/projected/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-kube-api-access-p6nkc\") pod \"community-operators-vfjzm\" (UID: \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\") " pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.419369 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-utilities\") pod \"community-operators-vfjzm\" (UID: \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\") " pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.521548 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-catalog-content\") pod \"community-operators-vfjzm\" (UID: \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\") " pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.521965 4873 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-p6nkc\" (UniqueName: \"kubernetes.io/projected/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-kube-api-access-p6nkc\") pod \"community-operators-vfjzm\" (UID: \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\") " pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.522039 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-utilities\") pod \"community-operators-vfjzm\" (UID: \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\") " pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.522735 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-utilities\") pod \"community-operators-vfjzm\" (UID: \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\") " pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.522822 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-catalog-content\") pod \"community-operators-vfjzm\" (UID: \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\") " pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.544598 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6nkc\" (UniqueName: \"kubernetes.io/projected/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-kube-api-access-p6nkc\") pod \"community-operators-vfjzm\" (UID: \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\") " pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.671207 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:34 crc kubenswrapper[4873]: I0219 11:25:34.808969 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-v7nww_5d79d4d8-e595-4aec-bc0b-7347b826c257/prometheus-operator/0.log" Feb 19 11:25:35 crc kubenswrapper[4873]: I0219 11:25:35.123272 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7496f9f864-qptdb_4724c979-0040-4017-86ce-78d2a8bdb44e/prometheus-operator-admission-webhook/0.log" Feb 19 11:25:35 crc kubenswrapper[4873]: I0219 11:25:35.142724 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7_3180318c-7d9a-454b-8de4-887fabae362b/prometheus-operator-admission-webhook/0.log" Feb 19 11:25:35 crc kubenswrapper[4873]: I0219 11:25:35.298047 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vfjzm"] Feb 19 11:25:35 crc kubenswrapper[4873]: W0219 11:25:35.301978 4873 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e9e14ea_5f94_4828_9b90_06e1b92b6e87.slice/crio-b5bed3859900a512dacb242a8fff7e5f63835fd00ecc36327832ce7b28da3a82 WatchSource:0}: Error finding container b5bed3859900a512dacb242a8fff7e5f63835fd00ecc36327832ce7b28da3a82: Status 404 returned error can't find the container with id b5bed3859900a512dacb242a8fff7e5f63835fd00ecc36327832ce7b28da3a82 Feb 19 11:25:35 crc kubenswrapper[4873]: I0219 11:25:35.370076 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-7wtlv_b23281d2-935e-47c1-bc83-8d00c7649625/operator/0.log" Feb 19 11:25:35 crc kubenswrapper[4873]: I0219 11:25:35.407674 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-8sflg_ea1cc2c7-c932-4b3d-b718-d017eb06163f/perses-operator/0.log" Feb 19 11:25:35 crc kubenswrapper[4873]: I0219 11:25:35.499157 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e39282f2-483a-457f-9a81-ed6faf0794a2" path="/var/lib/kubelet/pods/e39282f2-483a-457f-9a81-ed6faf0794a2/volumes" Feb 19 11:25:35 crc kubenswrapper[4873]: I0219 11:25:35.812130 4873 generic.go:334] "Generic (PLEG): container finished" podID="8e9e14ea-5f94-4828-9b90-06e1b92b6e87" containerID="5973cfd0614ad827001f3dd79c532a2ada6c7975a5ffe3c0aec2b940fa38f78e" exitCode=0 Feb 19 11:25:35 crc kubenswrapper[4873]: I0219 11:25:35.812284 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vfjzm" event={"ID":"8e9e14ea-5f94-4828-9b90-06e1b92b6e87","Type":"ContainerDied","Data":"5973cfd0614ad827001f3dd79c532a2ada6c7975a5ffe3c0aec2b940fa38f78e"} Feb 19 11:25:35 crc kubenswrapper[4873]: I0219 11:25:35.813196 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vfjzm" event={"ID":"8e9e14ea-5f94-4828-9b90-06e1b92b6e87","Type":"ContainerStarted","Data":"b5bed3859900a512dacb242a8fff7e5f63835fd00ecc36327832ce7b28da3a82"} Feb 19 11:25:35 crc kubenswrapper[4873]: I0219 11:25:35.814470 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 11:25:36 crc kubenswrapper[4873]: I0219 11:25:36.824564 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vfjzm" event={"ID":"8e9e14ea-5f94-4828-9b90-06e1b92b6e87","Type":"ContainerStarted","Data":"3118c73605c071b588191b2928a7c45823fab027d88c7b1c6466031a472caeea"} Feb 19 11:25:38 crc kubenswrapper[4873]: I0219 11:25:38.850926 4873 generic.go:334] "Generic (PLEG): container finished" podID="8e9e14ea-5f94-4828-9b90-06e1b92b6e87" 
containerID="3118c73605c071b588191b2928a7c45823fab027d88c7b1c6466031a472caeea" exitCode=0 Feb 19 11:25:38 crc kubenswrapper[4873]: I0219 11:25:38.851360 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vfjzm" event={"ID":"8e9e14ea-5f94-4828-9b90-06e1b92b6e87","Type":"ContainerDied","Data":"3118c73605c071b588191b2928a7c45823fab027d88c7b1c6466031a472caeea"} Feb 19 11:25:38 crc kubenswrapper[4873]: I0219 11:25:38.915139 4873 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vh782" podUID="2f11d216-e951-49a8-9728-9348dd3e09ab" containerName="registry-server" probeResult="failure" output=< Feb 19 11:25:38 crc kubenswrapper[4873]: timeout: failed to connect service ":50051" within 1s Feb 19 11:25:38 crc kubenswrapper[4873]: > Feb 19 11:25:39 crc kubenswrapper[4873]: I0219 11:25:39.867389 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vfjzm" event={"ID":"8e9e14ea-5f94-4828-9b90-06e1b92b6e87","Type":"ContainerStarted","Data":"b6bc77f409177594424c5b1375a9ee42663caf3262a2565a4e7e5b184266f6e4"} Feb 19 11:25:39 crc kubenswrapper[4873]: I0219 11:25:39.887327 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vfjzm" podStartSLOduration=2.437900938 podStartE2EDuration="5.887304006s" podCreationTimestamp="2026-02-19 11:25:34 +0000 UTC" firstStartedPulling="2026-02-19 11:25:35.814228353 +0000 UTC m=+6045.103659991" lastFinishedPulling="2026-02-19 11:25:39.263631411 +0000 UTC m=+6048.553063059" observedRunningTime="2026-02-19 11:25:39.885175112 +0000 UTC m=+6049.174606770" watchObservedRunningTime="2026-02-19 11:25:39.887304006 +0000 UTC m=+6049.176735644" Feb 19 11:25:44 crc kubenswrapper[4873]: I0219 11:25:44.672407 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:44 crc 
kubenswrapper[4873]: I0219 11:25:44.673477 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:44 crc kubenswrapper[4873]: I0219 11:25:44.715339 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:44 crc kubenswrapper[4873]: I0219 11:25:44.976612 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:45 crc kubenswrapper[4873]: I0219 11:25:45.021142 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vfjzm"] Feb 19 11:25:46 crc kubenswrapper[4873]: I0219 11:25:46.931644 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vfjzm" podUID="8e9e14ea-5f94-4828-9b90-06e1b92b6e87" containerName="registry-server" containerID="cri-o://b6bc77f409177594424c5b1375a9ee42663caf3262a2565a4e7e5b184266f6e4" gracePeriod=2 Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.436759 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.496531 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6nkc\" (UniqueName: \"kubernetes.io/projected/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-kube-api-access-p6nkc\") pod \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\" (UID: \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\") " Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.496660 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-catalog-content\") pod \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\" (UID: \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\") " Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.496739 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-utilities\") pod \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\" (UID: \"8e9e14ea-5f94-4828-9b90-06e1b92b6e87\") " Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.498246 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-utilities" (OuterVolumeSpecName: "utilities") pod "8e9e14ea-5f94-4828-9b90-06e1b92b6e87" (UID: "8e9e14ea-5f94-4828-9b90-06e1b92b6e87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.503447 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-kube-api-access-p6nkc" (OuterVolumeSpecName: "kube-api-access-p6nkc") pod "8e9e14ea-5f94-4828-9b90-06e1b92b6e87" (UID: "8e9e14ea-5f94-4828-9b90-06e1b92b6e87"). InnerVolumeSpecName "kube-api-access-p6nkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.564762 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e9e14ea-5f94-4828-9b90-06e1b92b6e87" (UID: "8e9e14ea-5f94-4828-9b90-06e1b92b6e87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.599871 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.599903 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.599915 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6nkc\" (UniqueName: \"kubernetes.io/projected/8e9e14ea-5f94-4828-9b90-06e1b92b6e87-kube-api-access-p6nkc\") on node \"crc\" DevicePath \"\"" Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.932404 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.945849 4873 generic.go:334] "Generic (PLEG): container finished" podID="8e9e14ea-5f94-4828-9b90-06e1b92b6e87" containerID="b6bc77f409177594424c5b1375a9ee42663caf3262a2565a4e7e5b184266f6e4" exitCode=0 Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.945894 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vfjzm" 
event={"ID":"8e9e14ea-5f94-4828-9b90-06e1b92b6e87","Type":"ContainerDied","Data":"b6bc77f409177594424c5b1375a9ee42663caf3262a2565a4e7e5b184266f6e4"} Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.945922 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vfjzm" event={"ID":"8e9e14ea-5f94-4828-9b90-06e1b92b6e87","Type":"ContainerDied","Data":"b5bed3859900a512dacb242a8fff7e5f63835fd00ecc36327832ce7b28da3a82"} Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.945943 4873 scope.go:117] "RemoveContainer" containerID="b6bc77f409177594424c5b1375a9ee42663caf3262a2565a4e7e5b184266f6e4" Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.946124 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vfjzm" Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.985881 4873 scope.go:117] "RemoveContainer" containerID="3118c73605c071b588191b2928a7c45823fab027d88c7b1c6466031a472caeea" Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.991325 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vfjzm"] Feb 19 11:25:47 crc kubenswrapper[4873]: I0219 11:25:47.993726 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:48 crc kubenswrapper[4873]: I0219 11:25:48.002191 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vfjzm"] Feb 19 11:25:48 crc kubenswrapper[4873]: I0219 11:25:48.013238 4873 scope.go:117] "RemoveContainer" containerID="5973cfd0614ad827001f3dd79c532a2ada6c7975a5ffe3c0aec2b940fa38f78e" Feb 19 11:25:48 crc kubenswrapper[4873]: I0219 11:25:48.082774 4873 scope.go:117] "RemoveContainer" containerID="b6bc77f409177594424c5b1375a9ee42663caf3262a2565a4e7e5b184266f6e4" Feb 19 11:25:48 crc kubenswrapper[4873]: E0219 11:25:48.086659 4873 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6bc77f409177594424c5b1375a9ee42663caf3262a2565a4e7e5b184266f6e4\": container with ID starting with b6bc77f409177594424c5b1375a9ee42663caf3262a2565a4e7e5b184266f6e4 not found: ID does not exist" containerID="b6bc77f409177594424c5b1375a9ee42663caf3262a2565a4e7e5b184266f6e4" Feb 19 11:25:48 crc kubenswrapper[4873]: I0219 11:25:48.086708 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6bc77f409177594424c5b1375a9ee42663caf3262a2565a4e7e5b184266f6e4"} err="failed to get container status \"b6bc77f409177594424c5b1375a9ee42663caf3262a2565a4e7e5b184266f6e4\": rpc error: code = NotFound desc = could not find container \"b6bc77f409177594424c5b1375a9ee42663caf3262a2565a4e7e5b184266f6e4\": container with ID starting with b6bc77f409177594424c5b1375a9ee42663caf3262a2565a4e7e5b184266f6e4 not found: ID does not exist" Feb 19 11:25:48 crc kubenswrapper[4873]: I0219 11:25:48.086737 4873 scope.go:117] "RemoveContainer" containerID="3118c73605c071b588191b2928a7c45823fab027d88c7b1c6466031a472caeea" Feb 19 11:25:48 crc kubenswrapper[4873]: E0219 11:25:48.087180 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3118c73605c071b588191b2928a7c45823fab027d88c7b1c6466031a472caeea\": container with ID starting with 3118c73605c071b588191b2928a7c45823fab027d88c7b1c6466031a472caeea not found: ID does not exist" containerID="3118c73605c071b588191b2928a7c45823fab027d88c7b1c6466031a472caeea" Feb 19 11:25:48 crc kubenswrapper[4873]: I0219 11:25:48.087203 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3118c73605c071b588191b2928a7c45823fab027d88c7b1c6466031a472caeea"} err="failed to get container status \"3118c73605c071b588191b2928a7c45823fab027d88c7b1c6466031a472caeea\": rpc error: code = NotFound desc = could 
not find container \"3118c73605c071b588191b2928a7c45823fab027d88c7b1c6466031a472caeea\": container with ID starting with 3118c73605c071b588191b2928a7c45823fab027d88c7b1c6466031a472caeea not found: ID does not exist" Feb 19 11:25:48 crc kubenswrapper[4873]: I0219 11:25:48.087216 4873 scope.go:117] "RemoveContainer" containerID="5973cfd0614ad827001f3dd79c532a2ada6c7975a5ffe3c0aec2b940fa38f78e" Feb 19 11:25:48 crc kubenswrapper[4873]: E0219 11:25:48.087450 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5973cfd0614ad827001f3dd79c532a2ada6c7975a5ffe3c0aec2b940fa38f78e\": container with ID starting with 5973cfd0614ad827001f3dd79c532a2ada6c7975a5ffe3c0aec2b940fa38f78e not found: ID does not exist" containerID="5973cfd0614ad827001f3dd79c532a2ada6c7975a5ffe3c0aec2b940fa38f78e" Feb 19 11:25:48 crc kubenswrapper[4873]: I0219 11:25:48.087474 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5973cfd0614ad827001f3dd79c532a2ada6c7975a5ffe3c0aec2b940fa38f78e"} err="failed to get container status \"5973cfd0614ad827001f3dd79c532a2ada6c7975a5ffe3c0aec2b940fa38f78e\": rpc error: code = NotFound desc = could not find container \"5973cfd0614ad827001f3dd79c532a2ada6c7975a5ffe3c0aec2b940fa38f78e\": container with ID starting with 5973cfd0614ad827001f3dd79c532a2ada6c7975a5ffe3c0aec2b940fa38f78e not found: ID does not exist" Feb 19 11:25:48 crc kubenswrapper[4873]: I0219 11:25:48.551648 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vh782"] Feb 19 11:25:49 crc kubenswrapper[4873]: I0219 11:25:49.272644 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7t964_4a42b4a3-c207-40a8-80b9-0532a0ec2865/kube-rbac-proxy/0.log" Feb 19 11:25:49 crc kubenswrapper[4873]: I0219 11:25:49.325815 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-69bbfbf88f-7t964_4a42b4a3-c207-40a8-80b9-0532a0ec2865/controller/0.log" Feb 19 11:25:49 crc kubenswrapper[4873]: I0219 11:25:49.471575 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-frr-files/0.log" Feb 19 11:25:49 crc kubenswrapper[4873]: I0219 11:25:49.498427 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e9e14ea-5f94-4828-9b90-06e1b92b6e87" path="/var/lib/kubelet/pods/8e9e14ea-5f94-4828-9b90-06e1b92b6e87/volumes" Feb 19 11:25:49 crc kubenswrapper[4873]: I0219 11:25:49.689258 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-frr-files/0.log" Feb 19 11:25:49 crc kubenswrapper[4873]: I0219 11:25:49.698229 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-reloader/0.log" Feb 19 11:25:49 crc kubenswrapper[4873]: I0219 11:25:49.713662 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-reloader/0.log" Feb 19 11:25:49 crc kubenswrapper[4873]: I0219 11:25:49.720355 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-metrics/0.log" Feb 19 11:25:49 crc kubenswrapper[4873]: I0219 11:25:49.965142 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vh782" podUID="2f11d216-e951-49a8-9728-9348dd3e09ab" containerName="registry-server" containerID="cri-o://5f8392fe03ecbd700fd918ae9af7167011d155c907cde8e0623f08621d930819" gracePeriod=2 Feb 19 11:25:49 crc kubenswrapper[4873]: I0219 11:25:49.972236 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-metrics/0.log" Feb 19 11:25:49 crc kubenswrapper[4873]: I0219 11:25:49.972252 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-frr-files/0.log" Feb 19 11:25:49 crc kubenswrapper[4873]: I0219 11:25:49.980191 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-reloader/0.log" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.052192 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-metrics/0.log" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.239777 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-reloader/0.log" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.241409 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-frr-files/0.log" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.258029 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/cp-metrics/0.log" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.305042 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/controller/0.log" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.481719 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/kube-rbac-proxy/0.log" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.499529 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/kube-rbac-proxy-frr/0.log" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.508919 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.556789 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/frr-metrics/0.log" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.668797 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdqsh\" (UniqueName: \"kubernetes.io/projected/2f11d216-e951-49a8-9728-9348dd3e09ab-kube-api-access-wdqsh\") pod \"2f11d216-e951-49a8-9728-9348dd3e09ab\" (UID: \"2f11d216-e951-49a8-9728-9348dd3e09ab\") " Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.669036 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f11d216-e951-49a8-9728-9348dd3e09ab-catalog-content\") pod \"2f11d216-e951-49a8-9728-9348dd3e09ab\" (UID: \"2f11d216-e951-49a8-9728-9348dd3e09ab\") " Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.669341 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f11d216-e951-49a8-9728-9348dd3e09ab-utilities\") pod \"2f11d216-e951-49a8-9728-9348dd3e09ab\" (UID: \"2f11d216-e951-49a8-9728-9348dd3e09ab\") " Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.670411 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f11d216-e951-49a8-9728-9348dd3e09ab-utilities" (OuterVolumeSpecName: "utilities") pod "2f11d216-e951-49a8-9728-9348dd3e09ab" (UID: "2f11d216-e951-49a8-9728-9348dd3e09ab"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.684134 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f11d216-e951-49a8-9728-9348dd3e09ab-kube-api-access-wdqsh" (OuterVolumeSpecName: "kube-api-access-wdqsh") pod "2f11d216-e951-49a8-9728-9348dd3e09ab" (UID: "2f11d216-e951-49a8-9728-9348dd3e09ab"). InnerVolumeSpecName "kube-api-access-wdqsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.755290 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/reloader/0.log" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.773647 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f11d216-e951-49a8-9728-9348dd3e09ab-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.773882 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdqsh\" (UniqueName: \"kubernetes.io/projected/2f11d216-e951-49a8-9728-9348dd3e09ab-kube-api-access-wdqsh\") on node \"crc\" DevicePath \"\"" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.809777 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f11d216-e951-49a8-9728-9348dd3e09ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f11d216-e951-49a8-9728-9348dd3e09ab" (UID: "2f11d216-e951-49a8-9728-9348dd3e09ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.816438 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-xwr52_8d8f9aee-601f-4530-876b-83709311196b/frr-k8s-webhook-server/0.log" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.875678 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f11d216-e951-49a8-9728-9348dd3e09ab-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.978736 4873 generic.go:334] "Generic (PLEG): container finished" podID="2f11d216-e951-49a8-9728-9348dd3e09ab" containerID="5f8392fe03ecbd700fd918ae9af7167011d155c907cde8e0623f08621d930819" exitCode=0 Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.978799 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh782" event={"ID":"2f11d216-e951-49a8-9728-9348dd3e09ab","Type":"ContainerDied","Data":"5f8392fe03ecbd700fd918ae9af7167011d155c907cde8e0623f08621d930819"} Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.979136 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh782" event={"ID":"2f11d216-e951-49a8-9728-9348dd3e09ab","Type":"ContainerDied","Data":"b05d7372ec51e0442e4648a2dc4cd081b94c1b9c9a02e0ea0e267bcfe4e90abb"} Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.979244 4873 scope.go:117] "RemoveContainer" containerID="5f8392fe03ecbd700fd918ae9af7167011d155c907cde8e0623f08621d930819" Feb 19 11:25:50 crc kubenswrapper[4873]: I0219 11:25:50.978818 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vh782" Feb 19 11:25:51 crc kubenswrapper[4873]: I0219 11:25:51.010331 4873 scope.go:117] "RemoveContainer" containerID="4b5582085ad38131d740eddcd5c773d88f2d945d5cf67562d6dc6da86ee55b55" Feb 19 11:25:51 crc kubenswrapper[4873]: I0219 11:25:51.037938 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vh782"] Feb 19 11:25:51 crc kubenswrapper[4873]: I0219 11:25:51.047915 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vh782"] Feb 19 11:25:51 crc kubenswrapper[4873]: I0219 11:25:51.075997 4873 scope.go:117] "RemoveContainer" containerID="edb548abbd6f9dd1cc88a622635559640a64f2ec1f91e50fae7650e337dd57f4" Feb 19 11:25:51 crc kubenswrapper[4873]: I0219 11:25:51.082080 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6897955989-f6tl8_94f344cf-0f09-4812-ab40-dcce7f260a53/manager/0.log" Feb 19 11:25:51 crc kubenswrapper[4873]: I0219 11:25:51.098266 4873 scope.go:117] "RemoveContainer" containerID="5f8392fe03ecbd700fd918ae9af7167011d155c907cde8e0623f08621d930819" Feb 19 11:25:51 crc kubenswrapper[4873]: E0219 11:25:51.099252 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f8392fe03ecbd700fd918ae9af7167011d155c907cde8e0623f08621d930819\": container with ID starting with 5f8392fe03ecbd700fd918ae9af7167011d155c907cde8e0623f08621d930819 not found: ID does not exist" containerID="5f8392fe03ecbd700fd918ae9af7167011d155c907cde8e0623f08621d930819" Feb 19 11:25:51 crc kubenswrapper[4873]: I0219 11:25:51.099291 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f8392fe03ecbd700fd918ae9af7167011d155c907cde8e0623f08621d930819"} err="failed to get container status 
\"5f8392fe03ecbd700fd918ae9af7167011d155c907cde8e0623f08621d930819\": rpc error: code = NotFound desc = could not find container \"5f8392fe03ecbd700fd918ae9af7167011d155c907cde8e0623f08621d930819\": container with ID starting with 5f8392fe03ecbd700fd918ae9af7167011d155c907cde8e0623f08621d930819 not found: ID does not exist" Feb 19 11:25:51 crc kubenswrapper[4873]: I0219 11:25:51.099316 4873 scope.go:117] "RemoveContainer" containerID="4b5582085ad38131d740eddcd5c773d88f2d945d5cf67562d6dc6da86ee55b55" Feb 19 11:25:51 crc kubenswrapper[4873]: E0219 11:25:51.100563 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b5582085ad38131d740eddcd5c773d88f2d945d5cf67562d6dc6da86ee55b55\": container with ID starting with 4b5582085ad38131d740eddcd5c773d88f2d945d5cf67562d6dc6da86ee55b55 not found: ID does not exist" containerID="4b5582085ad38131d740eddcd5c773d88f2d945d5cf67562d6dc6da86ee55b55" Feb 19 11:25:51 crc kubenswrapper[4873]: I0219 11:25:51.100597 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5582085ad38131d740eddcd5c773d88f2d945d5cf67562d6dc6da86ee55b55"} err="failed to get container status \"4b5582085ad38131d740eddcd5c773d88f2d945d5cf67562d6dc6da86ee55b55\": rpc error: code = NotFound desc = could not find container \"4b5582085ad38131d740eddcd5c773d88f2d945d5cf67562d6dc6da86ee55b55\": container with ID starting with 4b5582085ad38131d740eddcd5c773d88f2d945d5cf67562d6dc6da86ee55b55 not found: ID does not exist" Feb 19 11:25:51 crc kubenswrapper[4873]: I0219 11:25:51.100618 4873 scope.go:117] "RemoveContainer" containerID="edb548abbd6f9dd1cc88a622635559640a64f2ec1f91e50fae7650e337dd57f4" Feb 19 11:25:51 crc kubenswrapper[4873]: E0219 11:25:51.103313 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"edb548abbd6f9dd1cc88a622635559640a64f2ec1f91e50fae7650e337dd57f4\": container with ID starting with edb548abbd6f9dd1cc88a622635559640a64f2ec1f91e50fae7650e337dd57f4 not found: ID does not exist" containerID="edb548abbd6f9dd1cc88a622635559640a64f2ec1f91e50fae7650e337dd57f4" Feb 19 11:25:51 crc kubenswrapper[4873]: I0219 11:25:51.103360 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb548abbd6f9dd1cc88a622635559640a64f2ec1f91e50fae7650e337dd57f4"} err="failed to get container status \"edb548abbd6f9dd1cc88a622635559640a64f2ec1f91e50fae7650e337dd57f4\": rpc error: code = NotFound desc = could not find container \"edb548abbd6f9dd1cc88a622635559640a64f2ec1f91e50fae7650e337dd57f4\": container with ID starting with edb548abbd6f9dd1cc88a622635559640a64f2ec1f91e50fae7650e337dd57f4 not found: ID does not exist" Feb 19 11:25:51 crc kubenswrapper[4873]: I0219 11:25:51.315550 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7bf7457c95-rq2ph_e9d29e18-f362-478f-911d-ed979e43aae1/webhook-server/0.log" Feb 19 11:25:51 crc kubenswrapper[4873]: I0219 11:25:51.412017 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-phsr6_46cac2a1-6c87-4c4e-a73f-92dbee290015/kube-rbac-proxy/0.log" Feb 19 11:25:51 crc kubenswrapper[4873]: I0219 11:25:51.500742 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f11d216-e951-49a8-9728-9348dd3e09ab" path="/var/lib/kubelet/pods/2f11d216-e951-49a8-9728-9348dd3e09ab/volumes" Feb 19 11:25:52 crc kubenswrapper[4873]: I0219 11:25:52.040527 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-phsr6_46cac2a1-6c87-4c4e-a73f-92dbee290015/speaker/0.log" Feb 19 11:25:52 crc kubenswrapper[4873]: I0219 11:25:52.386204 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w8fjg_76ea40c9-c4a3-4a32-82a5-d725a73db80d/frr/0.log" 
Feb 19 11:26:04 crc kubenswrapper[4873]: I0219 11:26:04.044693 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn_0709e82b-60e9-4aed-8e42-e39928e74c21/util/0.log" Feb 19 11:26:04 crc kubenswrapper[4873]: I0219 11:26:04.179095 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn_0709e82b-60e9-4aed-8e42-e39928e74c21/util/0.log" Feb 19 11:26:04 crc kubenswrapper[4873]: I0219 11:26:04.254292 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn_0709e82b-60e9-4aed-8e42-e39928e74c21/pull/0.log" Feb 19 11:26:04 crc kubenswrapper[4873]: I0219 11:26:04.294511 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn_0709e82b-60e9-4aed-8e42-e39928e74c21/pull/0.log" Feb 19 11:26:04 crc kubenswrapper[4873]: I0219 11:26:04.456995 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn_0709e82b-60e9-4aed-8e42-e39928e74c21/util/0.log" Feb 19 11:26:04 crc kubenswrapper[4873]: I0219 11:26:04.479946 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn_0709e82b-60e9-4aed-8e42-e39928e74c21/extract/0.log" Feb 19 11:26:04 crc kubenswrapper[4873]: I0219 11:26:04.482975 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f088qsgn_0709e82b-60e9-4aed-8e42-e39928e74c21/pull/0.log" Feb 19 11:26:04 crc kubenswrapper[4873]: I0219 11:26:04.659022 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf_7a09955d-14f6-4877-bcb4-701d57165495/util/0.log" Feb 19 11:26:04 crc kubenswrapper[4873]: I0219 11:26:04.835392 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf_7a09955d-14f6-4877-bcb4-701d57165495/pull/0.log" Feb 19 11:26:04 crc kubenswrapper[4873]: I0219 11:26:04.852510 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf_7a09955d-14f6-4877-bcb4-701d57165495/util/0.log" Feb 19 11:26:04 crc kubenswrapper[4873]: I0219 11:26:04.877561 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf_7a09955d-14f6-4877-bcb4-701d57165495/pull/0.log" Feb 19 11:26:05 crc kubenswrapper[4873]: I0219 11:26:05.032006 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf_7a09955d-14f6-4877-bcb4-701d57165495/util/0.log" Feb 19 11:26:05 crc kubenswrapper[4873]: I0219 11:26:05.041113 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf_7a09955d-14f6-4877-bcb4-701d57165495/pull/0.log" Feb 19 11:26:05 crc kubenswrapper[4873]: I0219 11:26:05.051373 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213xwxbf_7a09955d-14f6-4877-bcb4-701d57165495/extract/0.log" Feb 19 11:26:05 crc kubenswrapper[4873]: I0219 11:26:05.217209 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bdcwz_d27fce7f-0ae7-4e22-885f-ad2a398647cc/extract-utilities/0.log" Feb 19 11:26:05 crc 
kubenswrapper[4873]: I0219 11:26:05.424066 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bdcwz_d27fce7f-0ae7-4e22-885f-ad2a398647cc/extract-utilities/0.log" Feb 19 11:26:05 crc kubenswrapper[4873]: I0219 11:26:05.431278 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bdcwz_d27fce7f-0ae7-4e22-885f-ad2a398647cc/extract-content/0.log" Feb 19 11:26:05 crc kubenswrapper[4873]: I0219 11:26:05.466083 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bdcwz_d27fce7f-0ae7-4e22-885f-ad2a398647cc/extract-content/0.log" Feb 19 11:26:05 crc kubenswrapper[4873]: I0219 11:26:05.610365 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bdcwz_d27fce7f-0ae7-4e22-885f-ad2a398647cc/extract-utilities/0.log" Feb 19 11:26:05 crc kubenswrapper[4873]: I0219 11:26:05.701324 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bdcwz_d27fce7f-0ae7-4e22-885f-ad2a398647cc/extract-content/0.log" Feb 19 11:26:05 crc kubenswrapper[4873]: I0219 11:26:05.926568 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zk9wc_5f466b31-21ca-4f19-9b73-72cfb7c68d55/extract-utilities/0.log" Feb 19 11:26:06 crc kubenswrapper[4873]: I0219 11:26:06.067919 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zk9wc_5f466b31-21ca-4f19-9b73-72cfb7c68d55/extract-content/0.log" Feb 19 11:26:06 crc kubenswrapper[4873]: I0219 11:26:06.074262 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zk9wc_5f466b31-21ca-4f19-9b73-72cfb7c68d55/extract-utilities/0.log" Feb 19 11:26:06 crc kubenswrapper[4873]: I0219 11:26:06.176558 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-bdcwz_d27fce7f-0ae7-4e22-885f-ad2a398647cc/registry-server/0.log" Feb 19 11:26:06 crc kubenswrapper[4873]: I0219 11:26:06.211259 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zk9wc_5f466b31-21ca-4f19-9b73-72cfb7c68d55/extract-content/0.log" Feb 19 11:26:06 crc kubenswrapper[4873]: I0219 11:26:06.388685 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zk9wc_5f466b31-21ca-4f19-9b73-72cfb7c68d55/extract-utilities/0.log" Feb 19 11:26:06 crc kubenswrapper[4873]: I0219 11:26:06.408746 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zk9wc_5f466b31-21ca-4f19-9b73-72cfb7c68d55/extract-content/0.log" Feb 19 11:26:06 crc kubenswrapper[4873]: I0219 11:26:06.655910 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv_14a07337-b89d-4574-aa0f-f9a3cdebdd48/util/0.log" Feb 19 11:26:06 crc kubenswrapper[4873]: I0219 11:26:06.851166 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv_14a07337-b89d-4574-aa0f-f9a3cdebdd48/pull/0.log" Feb 19 11:26:06 crc kubenswrapper[4873]: I0219 11:26:06.869336 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv_14a07337-b89d-4574-aa0f-f9a3cdebdd48/util/0.log" Feb 19 11:26:07 crc kubenswrapper[4873]: I0219 11:26:07.013564 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv_14a07337-b89d-4574-aa0f-f9a3cdebdd48/pull/0.log" Feb 19 11:26:07 crc kubenswrapper[4873]: I0219 11:26:07.150067 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv_14a07337-b89d-4574-aa0f-f9a3cdebdd48/util/0.log" Feb 19 11:26:07 crc kubenswrapper[4873]: I0219 11:26:07.238292 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv_14a07337-b89d-4574-aa0f-f9a3cdebdd48/pull/0.log" Feb 19 11:26:07 crc kubenswrapper[4873]: I0219 11:26:07.291298 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecaplfxv_14a07337-b89d-4574-aa0f-f9a3cdebdd48/extract/0.log" Feb 19 11:26:07 crc kubenswrapper[4873]: I0219 11:26:07.358320 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zk9wc_5f466b31-21ca-4f19-9b73-72cfb7c68d55/registry-server/0.log" Feb 19 11:26:07 crc kubenswrapper[4873]: I0219 11:26:07.473663 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jt9rj_1d58439b-31c6-44df-a32d-48f0fcb6a361/marketplace-operator/0.log" Feb 19 11:26:07 crc kubenswrapper[4873]: I0219 11:26:07.596235 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvshp_f9a9b521-3ed0-40c1-b38f-34c21bd9c242/extract-utilities/0.log" Feb 19 11:26:07 crc kubenswrapper[4873]: I0219 11:26:07.745739 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvshp_f9a9b521-3ed0-40c1-b38f-34c21bd9c242/extract-content/0.log" Feb 19 11:26:07 crc kubenswrapper[4873]: I0219 11:26:07.752092 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvshp_f9a9b521-3ed0-40c1-b38f-34c21bd9c242/extract-utilities/0.log" Feb 19 11:26:07 crc kubenswrapper[4873]: I0219 11:26:07.788872 4873 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvshp_f9a9b521-3ed0-40c1-b38f-34c21bd9c242/extract-content/0.log" Feb 19 11:26:07 crc kubenswrapper[4873]: I0219 11:26:07.958214 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvshp_f9a9b521-3ed0-40c1-b38f-34c21bd9c242/extract-utilities/0.log" Feb 19 11:26:07 crc kubenswrapper[4873]: I0219 11:26:07.997557 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvshp_f9a9b521-3ed0-40c1-b38f-34c21bd9c242/extract-content/0.log" Feb 19 11:26:08 crc kubenswrapper[4873]: I0219 11:26:08.163405 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prw4c_4cc54252-cfdf-4b71-bfa5-552dcd26500d/extract-utilities/0.log" Feb 19 11:26:08 crc kubenswrapper[4873]: I0219 11:26:08.220983 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xvshp_f9a9b521-3ed0-40c1-b38f-34c21bd9c242/registry-server/0.log" Feb 19 11:26:08 crc kubenswrapper[4873]: I0219 11:26:08.360745 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prw4c_4cc54252-cfdf-4b71-bfa5-552dcd26500d/extract-content/0.log" Feb 19 11:26:08 crc kubenswrapper[4873]: I0219 11:26:08.431611 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prw4c_4cc54252-cfdf-4b71-bfa5-552dcd26500d/extract-content/0.log" Feb 19 11:26:08 crc kubenswrapper[4873]: I0219 11:26:08.443331 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prw4c_4cc54252-cfdf-4b71-bfa5-552dcd26500d/extract-utilities/0.log" Feb 19 11:26:08 crc kubenswrapper[4873]: I0219 11:26:08.564058 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prw4c_4cc54252-cfdf-4b71-bfa5-552dcd26500d/extract-utilities/0.log" 
Feb 19 11:26:08 crc kubenswrapper[4873]: I0219 11:26:08.587464 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prw4c_4cc54252-cfdf-4b71-bfa5-552dcd26500d/extract-content/0.log" Feb 19 11:26:09 crc kubenswrapper[4873]: I0219 11:26:09.343937 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prw4c_4cc54252-cfdf-4b71-bfa5-552dcd26500d/registry-server/0.log" Feb 19 11:26:21 crc kubenswrapper[4873]: I0219 11:26:21.066546 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7496f9f864-qptdb_4724c979-0040-4017-86ce-78d2a8bdb44e/prometheus-operator-admission-webhook/0.log" Feb 19 11:26:21 crc kubenswrapper[4873]: I0219 11:26:21.076426 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-v7nww_5d79d4d8-e595-4aec-bc0b-7347b826c257/prometheus-operator/0.log" Feb 19 11:26:21 crc kubenswrapper[4873]: I0219 11:26:21.139066 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7496f9f864-zxpx7_3180318c-7d9a-454b-8de4-887fabae362b/prometheus-operator-admission-webhook/0.log" Feb 19 11:26:21 crc kubenswrapper[4873]: I0219 11:26:21.249478 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-7wtlv_b23281d2-935e-47c1-bc83-8d00c7649625/operator/0.log" Feb 19 11:26:21 crc kubenswrapper[4873]: I0219 11:26:21.282653 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-8sflg_ea1cc2c7-c932-4b3d-b718-d017eb06163f/perses-operator/0.log" Feb 19 11:27:48 crc kubenswrapper[4873]: I0219 11:27:48.240256 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 11:27:48 crc kubenswrapper[4873]: I0219 11:27:48.242322 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 11:28:18 crc kubenswrapper[4873]: I0219 11:28:18.240360 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 11:28:18 crc kubenswrapper[4873]: I0219 11:28:18.240888 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 11:28:34 crc kubenswrapper[4873]: I0219 11:28:34.603860 4873 generic.go:334] "Generic (PLEG): container finished" podID="93cc0682-3903-4dad-a4a1-3e807492bab4" containerID="ff3687756e2207c400bf2bbbc9410e3f3ee429ff1507157450e53ef945e6af06" exitCode=0 Feb 19 11:28:34 crc kubenswrapper[4873]: I0219 11:28:34.603965 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qhpxg/must-gather-vhpx8" event={"ID":"93cc0682-3903-4dad-a4a1-3e807492bab4","Type":"ContainerDied","Data":"ff3687756e2207c400bf2bbbc9410e3f3ee429ff1507157450e53ef945e6af06"} Feb 19 11:28:34 crc kubenswrapper[4873]: I0219 11:28:34.605043 4873 scope.go:117] "RemoveContainer" 
containerID="ff3687756e2207c400bf2bbbc9410e3f3ee429ff1507157450e53ef945e6af06" Feb 19 11:28:35 crc kubenswrapper[4873]: I0219 11:28:35.634723 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qhpxg_must-gather-vhpx8_93cc0682-3903-4dad-a4a1-3e807492bab4/gather/0.log" Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.240590 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.241185 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.241235 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.242079 4873 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94efb2b7f91f96b952ade76f54d8cf096f4da7e422f2f8758aaf2ca9208fbda3"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.242147 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" 
containerID="cri-o://94efb2b7f91f96b952ade76f54d8cf096f4da7e422f2f8758aaf2ca9208fbda3" gracePeriod=600 Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.492160 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qhpxg/must-gather-vhpx8"] Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.492946 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qhpxg/must-gather-vhpx8" podUID="93cc0682-3903-4dad-a4a1-3e807492bab4" containerName="copy" containerID="cri-o://767ace8d23069d52fe292289a53031e59a4a02afb307e1f96188d5747c12d9df" gracePeriod=2 Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.542441 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qhpxg/must-gather-vhpx8"] Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.765339 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="94efb2b7f91f96b952ade76f54d8cf096f4da7e422f2f8758aaf2ca9208fbda3" exitCode=0 Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.765722 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"94efb2b7f91f96b952ade76f54d8cf096f4da7e422f2f8758aaf2ca9208fbda3"} Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.765759 4873 scope.go:117] "RemoveContainer" containerID="98422463e0fc17a554205c92a5d79c560a2847f0d82d233735f9770c7f6f4838" Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.768978 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qhpxg_must-gather-vhpx8_93cc0682-3903-4dad-a4a1-3e807492bab4/copy/0.log" Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.769431 4873 generic.go:334] "Generic (PLEG): container finished" podID="93cc0682-3903-4dad-a4a1-3e807492bab4" 
containerID="767ace8d23069d52fe292289a53031e59a4a02afb307e1f96188d5747c12d9df" exitCode=143 Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.932479 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qhpxg_must-gather-vhpx8_93cc0682-3903-4dad-a4a1-3e807492bab4/copy/0.log" Feb 19 11:28:48 crc kubenswrapper[4873]: I0219 11:28:48.932991 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qhpxg/must-gather-vhpx8" Feb 19 11:28:49 crc kubenswrapper[4873]: I0219 11:28:49.092091 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfvlt\" (UniqueName: \"kubernetes.io/projected/93cc0682-3903-4dad-a4a1-3e807492bab4-kube-api-access-nfvlt\") pod \"93cc0682-3903-4dad-a4a1-3e807492bab4\" (UID: \"93cc0682-3903-4dad-a4a1-3e807492bab4\") " Feb 19 11:28:49 crc kubenswrapper[4873]: I0219 11:28:49.092485 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93cc0682-3903-4dad-a4a1-3e807492bab4-must-gather-output\") pod \"93cc0682-3903-4dad-a4a1-3e807492bab4\" (UID: \"93cc0682-3903-4dad-a4a1-3e807492bab4\") " Feb 19 11:28:49 crc kubenswrapper[4873]: I0219 11:28:49.098594 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93cc0682-3903-4dad-a4a1-3e807492bab4-kube-api-access-nfvlt" (OuterVolumeSpecName: "kube-api-access-nfvlt") pod "93cc0682-3903-4dad-a4a1-3e807492bab4" (UID: "93cc0682-3903-4dad-a4a1-3e807492bab4"). InnerVolumeSpecName "kube-api-access-nfvlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:28:49 crc kubenswrapper[4873]: I0219 11:28:49.194987 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfvlt\" (UniqueName: \"kubernetes.io/projected/93cc0682-3903-4dad-a4a1-3e807492bab4-kube-api-access-nfvlt\") on node \"crc\" DevicePath \"\"" Feb 19 11:28:49 crc kubenswrapper[4873]: I0219 11:28:49.289730 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93cc0682-3903-4dad-a4a1-3e807492bab4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "93cc0682-3903-4dad-a4a1-3e807492bab4" (UID: "93cc0682-3903-4dad-a4a1-3e807492bab4"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 11:28:49 crc kubenswrapper[4873]: I0219 11:28:49.296838 4873 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93cc0682-3903-4dad-a4a1-3e807492bab4-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 19 11:28:49 crc kubenswrapper[4873]: I0219 11:28:49.500979 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93cc0682-3903-4dad-a4a1-3e807492bab4" path="/var/lib/kubelet/pods/93cc0682-3903-4dad-a4a1-3e807492bab4/volumes" Feb 19 11:28:49 crc kubenswrapper[4873]: I0219 11:28:49.783864 4873 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qhpxg_must-gather-vhpx8_93cc0682-3903-4dad-a4a1-3e807492bab4/copy/0.log" Feb 19 11:28:49 crc kubenswrapper[4873]: I0219 11:28:49.784867 4873 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qhpxg/must-gather-vhpx8" Feb 19 11:28:49 crc kubenswrapper[4873]: I0219 11:28:49.784938 4873 scope.go:117] "RemoveContainer" containerID="767ace8d23069d52fe292289a53031e59a4a02afb307e1f96188d5747c12d9df" Feb 19 11:28:49 crc kubenswrapper[4873]: I0219 11:28:49.794077 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerStarted","Data":"c8e5f4205b3e7c0c1ae2128e3fdd0d50b895c85b98150f05f591e71e890b4f5b"} Feb 19 11:28:49 crc kubenswrapper[4873]: I0219 11:28:49.837318 4873 scope.go:117] "RemoveContainer" containerID="ff3687756e2207c400bf2bbbc9410e3f3ee429ff1507157450e53ef945e6af06" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.161672 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w"] Feb 19 11:30:00 crc kubenswrapper[4873]: E0219 11:30:00.162860 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9e14ea-5f94-4828-9b90-06e1b92b6e87" containerName="extract-content" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.162873 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9e14ea-5f94-4828-9b90-06e1b92b6e87" containerName="extract-content" Feb 19 11:30:00 crc kubenswrapper[4873]: E0219 11:30:00.162892 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93cc0682-3903-4dad-a4a1-3e807492bab4" containerName="gather" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.162899 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="93cc0682-3903-4dad-a4a1-3e807492bab4" containerName="gather" Feb 19 11:30:00 crc kubenswrapper[4873]: E0219 11:30:00.162911 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f11d216-e951-49a8-9728-9348dd3e09ab" containerName="registry-server" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 
11:30:00.162921 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f11d216-e951-49a8-9728-9348dd3e09ab" containerName="registry-server" Feb 19 11:30:00 crc kubenswrapper[4873]: E0219 11:30:00.162937 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f11d216-e951-49a8-9728-9348dd3e09ab" containerName="extract-utilities" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.162943 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f11d216-e951-49a8-9728-9348dd3e09ab" containerName="extract-utilities" Feb 19 11:30:00 crc kubenswrapper[4873]: E0219 11:30:00.162954 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f11d216-e951-49a8-9728-9348dd3e09ab" containerName="extract-content" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.162960 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f11d216-e951-49a8-9728-9348dd3e09ab" containerName="extract-content" Feb 19 11:30:00 crc kubenswrapper[4873]: E0219 11:30:00.162972 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93cc0682-3903-4dad-a4a1-3e807492bab4" containerName="copy" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.162978 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="93cc0682-3903-4dad-a4a1-3e807492bab4" containerName="copy" Feb 19 11:30:00 crc kubenswrapper[4873]: E0219 11:30:00.162994 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9e14ea-5f94-4828-9b90-06e1b92b6e87" containerName="extract-utilities" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.163001 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9e14ea-5f94-4828-9b90-06e1b92b6e87" containerName="extract-utilities" Feb 19 11:30:00 crc kubenswrapper[4873]: E0219 11:30:00.163014 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9e14ea-5f94-4828-9b90-06e1b92b6e87" containerName="registry-server" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.163019 4873 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9e14ea-5f94-4828-9b90-06e1b92b6e87" containerName="registry-server" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.163292 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="93cc0682-3903-4dad-a4a1-3e807492bab4" containerName="gather" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.163306 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f11d216-e951-49a8-9728-9348dd3e09ab" containerName="registry-server" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.163318 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="93cc0682-3903-4dad-a4a1-3e807492bab4" containerName="copy" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.163325 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e9e14ea-5f94-4828-9b90-06e1b92b6e87" containerName="registry-server" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.164023 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.167509 4873 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.176636 4873 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.183173 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w"] Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.201707 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txg9x\" (UniqueName: \"kubernetes.io/projected/3f7d66fd-cac8-4542-8d90-2e1604173795-kube-api-access-txg9x\") pod \"collect-profiles-29525010-drf9w\" (UID: \"3f7d66fd-cac8-4542-8d90-2e1604173795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.201896 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7d66fd-cac8-4542-8d90-2e1604173795-config-volume\") pod \"collect-profiles-29525010-drf9w\" (UID: \"3f7d66fd-cac8-4542-8d90-2e1604173795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.202137 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f7d66fd-cac8-4542-8d90-2e1604173795-secret-volume\") pod \"collect-profiles-29525010-drf9w\" (UID: \"3f7d66fd-cac8-4542-8d90-2e1604173795\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.303706 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f7d66fd-cac8-4542-8d90-2e1604173795-secret-volume\") pod \"collect-profiles-29525010-drf9w\" (UID: \"3f7d66fd-cac8-4542-8d90-2e1604173795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.303990 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txg9x\" (UniqueName: \"kubernetes.io/projected/3f7d66fd-cac8-4542-8d90-2e1604173795-kube-api-access-txg9x\") pod \"collect-profiles-29525010-drf9w\" (UID: \"3f7d66fd-cac8-4542-8d90-2e1604173795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.304068 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7d66fd-cac8-4542-8d90-2e1604173795-config-volume\") pod \"collect-profiles-29525010-drf9w\" (UID: \"3f7d66fd-cac8-4542-8d90-2e1604173795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.304952 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7d66fd-cac8-4542-8d90-2e1604173795-config-volume\") pod \"collect-profiles-29525010-drf9w\" (UID: \"3f7d66fd-cac8-4542-8d90-2e1604173795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.312284 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3f7d66fd-cac8-4542-8d90-2e1604173795-secret-volume\") pod \"collect-profiles-29525010-drf9w\" (UID: \"3f7d66fd-cac8-4542-8d90-2e1604173795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.323906 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txg9x\" (UniqueName: \"kubernetes.io/projected/3f7d66fd-cac8-4542-8d90-2e1604173795-kube-api-access-txg9x\") pod \"collect-profiles-29525010-drf9w\" (UID: \"3f7d66fd-cac8-4542-8d90-2e1604173795\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" Feb 19 11:30:00 crc kubenswrapper[4873]: I0219 11:30:00.505584 4873 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" Feb 19 11:30:01 crc kubenswrapper[4873]: I0219 11:30:01.002685 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w"] Feb 19 11:30:01 crc kubenswrapper[4873]: I0219 11:30:01.781595 4873 generic.go:334] "Generic (PLEG): container finished" podID="3f7d66fd-cac8-4542-8d90-2e1604173795" containerID="49ec155c61c455c5434148fd267e5f053f30dc96bc25a2a99c29dba3e6e8a1b1" exitCode=0 Feb 19 11:30:01 crc kubenswrapper[4873]: I0219 11:30:01.781686 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" event={"ID":"3f7d66fd-cac8-4542-8d90-2e1604173795","Type":"ContainerDied","Data":"49ec155c61c455c5434148fd267e5f053f30dc96bc25a2a99c29dba3e6e8a1b1"} Feb 19 11:30:01 crc kubenswrapper[4873]: I0219 11:30:01.782766 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" 
event={"ID":"3f7d66fd-cac8-4542-8d90-2e1604173795","Type":"ContainerStarted","Data":"e3c7023fa3a3faa12fbb82e63d2c17d79d083ae336289d86303fd9d583194477"} Feb 19 11:30:03 crc kubenswrapper[4873]: I0219 11:30:03.130364 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" Feb 19 11:30:03 crc kubenswrapper[4873]: I0219 11:30:03.277085 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txg9x\" (UniqueName: \"kubernetes.io/projected/3f7d66fd-cac8-4542-8d90-2e1604173795-kube-api-access-txg9x\") pod \"3f7d66fd-cac8-4542-8d90-2e1604173795\" (UID: \"3f7d66fd-cac8-4542-8d90-2e1604173795\") " Feb 19 11:30:03 crc kubenswrapper[4873]: I0219 11:30:03.277397 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7d66fd-cac8-4542-8d90-2e1604173795-config-volume\") pod \"3f7d66fd-cac8-4542-8d90-2e1604173795\" (UID: \"3f7d66fd-cac8-4542-8d90-2e1604173795\") " Feb 19 11:30:03 crc kubenswrapper[4873]: I0219 11:30:03.277489 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f7d66fd-cac8-4542-8d90-2e1604173795-secret-volume\") pod \"3f7d66fd-cac8-4542-8d90-2e1604173795\" (UID: \"3f7d66fd-cac8-4542-8d90-2e1604173795\") " Feb 19 11:30:03 crc kubenswrapper[4873]: I0219 11:30:03.278089 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f7d66fd-cac8-4542-8d90-2e1604173795-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f7d66fd-cac8-4542-8d90-2e1604173795" (UID: "3f7d66fd-cac8-4542-8d90-2e1604173795"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 11:30:03 crc kubenswrapper[4873]: I0219 11:30:03.289318 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f7d66fd-cac8-4542-8d90-2e1604173795-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3f7d66fd-cac8-4542-8d90-2e1604173795" (UID: "3f7d66fd-cac8-4542-8d90-2e1604173795"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 11:30:03 crc kubenswrapper[4873]: I0219 11:30:03.292626 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f7d66fd-cac8-4542-8d90-2e1604173795-kube-api-access-txg9x" (OuterVolumeSpecName: "kube-api-access-txg9x") pod "3f7d66fd-cac8-4542-8d90-2e1604173795" (UID: "3f7d66fd-cac8-4542-8d90-2e1604173795"). InnerVolumeSpecName "kube-api-access-txg9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 11:30:03 crc kubenswrapper[4873]: I0219 11:30:03.379536 4873 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7d66fd-cac8-4542-8d90-2e1604173795-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 11:30:03 crc kubenswrapper[4873]: I0219 11:30:03.379571 4873 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f7d66fd-cac8-4542-8d90-2e1604173795-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 11:30:03 crc kubenswrapper[4873]: I0219 11:30:03.379581 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txg9x\" (UniqueName: \"kubernetes.io/projected/3f7d66fd-cac8-4542-8d90-2e1604173795-kube-api-access-txg9x\") on node \"crc\" DevicePath \"\"" Feb 19 11:30:03 crc kubenswrapper[4873]: I0219 11:30:03.803315 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" 
event={"ID":"3f7d66fd-cac8-4542-8d90-2e1604173795","Type":"ContainerDied","Data":"e3c7023fa3a3faa12fbb82e63d2c17d79d083ae336289d86303fd9d583194477"} Feb 19 11:30:03 crc kubenswrapper[4873]: I0219 11:30:03.803352 4873 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3c7023fa3a3faa12fbb82e63d2c17d79d083ae336289d86303fd9d583194477" Feb 19 11:30:03 crc kubenswrapper[4873]: I0219 11:30:03.803376 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525010-drf9w" Feb 19 11:30:04 crc kubenswrapper[4873]: I0219 11:30:04.219072 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6"] Feb 19 11:30:04 crc kubenswrapper[4873]: I0219 11:30:04.231216 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524965-7h5c6"] Feb 19 11:30:05 crc kubenswrapper[4873]: I0219 11:30:05.847039 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3672337-92bc-4e97-9c9e-c0a7e7cd284b" path="/var/lib/kubelet/pods/e3672337-92bc-4e97-9c9e-c0a7e7cd284b/volumes" Feb 19 11:30:48 crc kubenswrapper[4873]: I0219 11:30:48.240279 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 11:30:48 crc kubenswrapper[4873]: I0219 11:30:48.242574 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 11:31:04 crc 
kubenswrapper[4873]: I0219 11:31:04.487555 4873 scope.go:117] "RemoveContainer" containerID="54b95d1d4eacbeaa7320e5a5833d0056b13e15ee90dc6b62a2553c6f88d2fff8" Feb 19 11:31:18 crc kubenswrapper[4873]: I0219 11:31:18.240339 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 11:31:18 crc kubenswrapper[4873]: I0219 11:31:18.241259 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 11:31:48 crc kubenswrapper[4873]: I0219 11:31:48.240475 4873 patch_prober.go:28] interesting pod/machine-config-daemon-qmsl7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 11:31:48 crc kubenswrapper[4873]: I0219 11:31:48.241165 4873 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 11:31:48 crc kubenswrapper[4873]: I0219 11:31:48.241225 4873 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" Feb 19 11:31:48 crc kubenswrapper[4873]: I0219 11:31:48.241997 4873 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c8e5f4205b3e7c0c1ae2128e3fdd0d50b895c85b98150f05f591e71e890b4f5b"} pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 11:31:48 crc kubenswrapper[4873]: I0219 11:31:48.242049 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" containerName="machine-config-daemon" containerID="cri-o://c8e5f4205b3e7c0c1ae2128e3fdd0d50b895c85b98150f05f591e71e890b4f5b" gracePeriod=600 Feb 19 11:31:48 crc kubenswrapper[4873]: E0219 11:31:48.361897 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:31:49 crc kubenswrapper[4873]: I0219 11:31:49.164060 4873 generic.go:334] "Generic (PLEG): container finished" podID="8c61760e-2955-4688-b68b-1ceeda73f356" containerID="c8e5f4205b3e7c0c1ae2128e3fdd0d50b895c85b98150f05f591e71e890b4f5b" exitCode=0 Feb 19 11:31:49 crc kubenswrapper[4873]: I0219 11:31:49.164193 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" event={"ID":"8c61760e-2955-4688-b68b-1ceeda73f356","Type":"ContainerDied","Data":"c8e5f4205b3e7c0c1ae2128e3fdd0d50b895c85b98150f05f591e71e890b4f5b"} Feb 19 11:31:49 crc kubenswrapper[4873]: I0219 11:31:49.164277 4873 scope.go:117] "RemoveContainer" containerID="94efb2b7f91f96b952ade76f54d8cf096f4da7e422f2f8758aaf2ca9208fbda3" Feb 19 11:31:49 crc 
kubenswrapper[4873]: I0219 11:31:49.164995 4873 scope.go:117] "RemoveContainer" containerID="c8e5f4205b3e7c0c1ae2128e3fdd0d50b895c85b98150f05f591e71e890b4f5b" Feb 19 11:31:49 crc kubenswrapper[4873]: E0219 11:31:49.165393 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356" Feb 19 11:31:50 crc kubenswrapper[4873]: I0219 11:31:50.878065 4873 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f2mgl"] Feb 19 11:31:50 crc kubenswrapper[4873]: E0219 11:31:50.880004 4873 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f7d66fd-cac8-4542-8d90-2e1604173795" containerName="collect-profiles" Feb 19 11:31:50 crc kubenswrapper[4873]: I0219 11:31:50.880144 4873 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7d66fd-cac8-4542-8d90-2e1604173795" containerName="collect-profiles" Feb 19 11:31:50 crc kubenswrapper[4873]: I0219 11:31:50.880508 4873 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f7d66fd-cac8-4542-8d90-2e1604173795" containerName="collect-profiles" Feb 19 11:31:50 crc kubenswrapper[4873]: I0219 11:31:50.885408 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f2mgl" Feb 19 11:31:50 crc kubenswrapper[4873]: I0219 11:31:50.898917 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f2mgl"] Feb 19 11:31:50 crc kubenswrapper[4873]: I0219 11:31:50.971615 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22781893-e25a-43dc-b961-51629986957a-catalog-content\") pod \"certified-operators-f2mgl\" (UID: \"22781893-e25a-43dc-b961-51629986957a\") " pod="openshift-marketplace/certified-operators-f2mgl" Feb 19 11:31:50 crc kubenswrapper[4873]: I0219 11:31:50.972084 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sps5g\" (UniqueName: \"kubernetes.io/projected/22781893-e25a-43dc-b961-51629986957a-kube-api-access-sps5g\") pod \"certified-operators-f2mgl\" (UID: \"22781893-e25a-43dc-b961-51629986957a\") " pod="openshift-marketplace/certified-operators-f2mgl" Feb 19 11:31:50 crc kubenswrapper[4873]: I0219 11:31:50.972312 4873 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22781893-e25a-43dc-b961-51629986957a-utilities\") pod \"certified-operators-f2mgl\" (UID: \"22781893-e25a-43dc-b961-51629986957a\") " pod="openshift-marketplace/certified-operators-f2mgl" Feb 19 11:31:51 crc kubenswrapper[4873]: I0219 11:31:51.074293 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22781893-e25a-43dc-b961-51629986957a-catalog-content\") pod \"certified-operators-f2mgl\" (UID: \"22781893-e25a-43dc-b961-51629986957a\") " pod="openshift-marketplace/certified-operators-f2mgl" Feb 19 11:31:51 crc kubenswrapper[4873]: I0219 11:31:51.074995 4873 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22781893-e25a-43dc-b961-51629986957a-catalog-content\") pod \"certified-operators-f2mgl\" (UID: \"22781893-e25a-43dc-b961-51629986957a\") " pod="openshift-marketplace/certified-operators-f2mgl" Feb 19 11:31:51 crc kubenswrapper[4873]: I0219 11:31:51.076177 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sps5g\" (UniqueName: \"kubernetes.io/projected/22781893-e25a-43dc-b961-51629986957a-kube-api-access-sps5g\") pod \"certified-operators-f2mgl\" (UID: \"22781893-e25a-43dc-b961-51629986957a\") " pod="openshift-marketplace/certified-operators-f2mgl" Feb 19 11:31:51 crc kubenswrapper[4873]: I0219 11:31:51.076552 4873 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22781893-e25a-43dc-b961-51629986957a-utilities\") pod \"certified-operators-f2mgl\" (UID: \"22781893-e25a-43dc-b961-51629986957a\") " pod="openshift-marketplace/certified-operators-f2mgl" Feb 19 11:31:51 crc kubenswrapper[4873]: I0219 11:31:51.076963 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22781893-e25a-43dc-b961-51629986957a-utilities\") pod \"certified-operators-f2mgl\" (UID: \"22781893-e25a-43dc-b961-51629986957a\") " pod="openshift-marketplace/certified-operators-f2mgl" Feb 19 11:31:51 crc kubenswrapper[4873]: I0219 11:31:51.095608 4873 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sps5g\" (UniqueName: \"kubernetes.io/projected/22781893-e25a-43dc-b961-51629986957a-kube-api-access-sps5g\") pod \"certified-operators-f2mgl\" (UID: \"22781893-e25a-43dc-b961-51629986957a\") " pod="openshift-marketplace/certified-operators-f2mgl" Feb 19 11:31:51 crc kubenswrapper[4873]: I0219 11:31:51.214018 4873 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f2mgl"
Feb 19 11:31:51 crc kubenswrapper[4873]: I0219 11:31:51.770555 4873 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f2mgl"]
Feb 19 11:31:52 crc kubenswrapper[4873]: I0219 11:31:52.196494 4873 generic.go:334] "Generic (PLEG): container finished" podID="22781893-e25a-43dc-b961-51629986957a" containerID="b2eda7221fe7f44826c7503e51dd624ab541a1643b0ad04e32da7864d6f8dab3" exitCode=0
Feb 19 11:31:52 crc kubenswrapper[4873]: I0219 11:31:52.196620 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2mgl" event={"ID":"22781893-e25a-43dc-b961-51629986957a","Type":"ContainerDied","Data":"b2eda7221fe7f44826c7503e51dd624ab541a1643b0ad04e32da7864d6f8dab3"}
Feb 19 11:31:52 crc kubenswrapper[4873]: I0219 11:31:52.196877 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2mgl" event={"ID":"22781893-e25a-43dc-b961-51629986957a","Type":"ContainerStarted","Data":"adfcfff4241ec7bc41746576adca7849a23b1df91e3ca1dd5f41fd3702cfc668"}
Feb 19 11:31:52 crc kubenswrapper[4873]: I0219 11:31:52.201027 4873 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 11:31:53 crc kubenswrapper[4873]: I0219 11:31:53.210801 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2mgl" event={"ID":"22781893-e25a-43dc-b961-51629986957a","Type":"ContainerStarted","Data":"21ef8368140c82be96b1a925b2aff609721a32bfa56b1096e8f028617eeee3f7"}
Feb 19 11:31:55 crc kubenswrapper[4873]: I0219 11:31:55.232678 4873 generic.go:334] "Generic (PLEG): container finished" podID="22781893-e25a-43dc-b961-51629986957a" containerID="21ef8368140c82be96b1a925b2aff609721a32bfa56b1096e8f028617eeee3f7" exitCode=0
Feb 19 11:31:55 crc kubenswrapper[4873]: I0219 11:31:55.232767 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2mgl" event={"ID":"22781893-e25a-43dc-b961-51629986957a","Type":"ContainerDied","Data":"21ef8368140c82be96b1a925b2aff609721a32bfa56b1096e8f028617eeee3f7"}
Feb 19 11:31:56 crc kubenswrapper[4873]: I0219 11:31:56.245955 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2mgl" event={"ID":"22781893-e25a-43dc-b961-51629986957a","Type":"ContainerStarted","Data":"08f172af911f92e58eb66d1cf87d64ac45aba05754469b3bc18cf51d2d6c67bc"}
Feb 19 11:31:56 crc kubenswrapper[4873]: I0219 11:31:56.268731 4873 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f2mgl" podStartSLOduration=2.793169823 podStartE2EDuration="6.268710229s" podCreationTimestamp="2026-02-19 11:31:50 +0000 UTC" firstStartedPulling="2026-02-19 11:31:52.200822276 +0000 UTC m=+6421.490253914" lastFinishedPulling="2026-02-19 11:31:55.676362672 +0000 UTC m=+6424.965794320" observedRunningTime="2026-02-19 11:31:56.264673487 +0000 UTC m=+6425.554105145" watchObservedRunningTime="2026-02-19 11:31:56.268710229 +0000 UTC m=+6425.558141867"
Feb 19 11:32:01 crc kubenswrapper[4873]: I0219 11:32:01.214815 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f2mgl"
Feb 19 11:32:01 crc kubenswrapper[4873]: I0219 11:32:01.215282 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f2mgl"
Feb 19 11:32:01 crc kubenswrapper[4873]: I0219 11:32:01.274065 4873 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f2mgl"
Feb 19 11:32:01 crc kubenswrapper[4873]: I0219 11:32:01.356535 4873 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f2mgl"
Feb 19 11:32:03 crc kubenswrapper[4873]: I0219 11:32:03.250703 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f2mgl"]
Feb 19 11:32:03 crc kubenswrapper[4873]: I0219 11:32:03.327827 4873 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f2mgl" podUID="22781893-e25a-43dc-b961-51629986957a" containerName="registry-server" containerID="cri-o://08f172af911f92e58eb66d1cf87d64ac45aba05754469b3bc18cf51d2d6c67bc" gracePeriod=2
Feb 19 11:32:03 crc kubenswrapper[4873]: I0219 11:32:03.484972 4873 scope.go:117] "RemoveContainer" containerID="c8e5f4205b3e7c0c1ae2128e3fdd0d50b895c85b98150f05f591e71e890b4f5b"
Feb 19 11:32:03 crc kubenswrapper[4873]: E0219 11:32:03.485751 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:32:03 crc kubenswrapper[4873]: I0219 11:32:03.798387 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f2mgl"
Feb 19 11:32:03 crc kubenswrapper[4873]: I0219 11:32:03.863008 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22781893-e25a-43dc-b961-51629986957a-catalog-content\") pod \"22781893-e25a-43dc-b961-51629986957a\" (UID: \"22781893-e25a-43dc-b961-51629986957a\") "
Feb 19 11:32:03 crc kubenswrapper[4873]: I0219 11:32:03.863258 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22781893-e25a-43dc-b961-51629986957a-utilities\") pod \"22781893-e25a-43dc-b961-51629986957a\" (UID: \"22781893-e25a-43dc-b961-51629986957a\") "
Feb 19 11:32:03 crc kubenswrapper[4873]: I0219 11:32:03.863361 4873 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sps5g\" (UniqueName: \"kubernetes.io/projected/22781893-e25a-43dc-b961-51629986957a-kube-api-access-sps5g\") pod \"22781893-e25a-43dc-b961-51629986957a\" (UID: \"22781893-e25a-43dc-b961-51629986957a\") "
Feb 19 11:32:03 crc kubenswrapper[4873]: I0219 11:32:03.864088 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22781893-e25a-43dc-b961-51629986957a-utilities" (OuterVolumeSpecName: "utilities") pod "22781893-e25a-43dc-b961-51629986957a" (UID: "22781893-e25a-43dc-b961-51629986957a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 11:32:03 crc kubenswrapper[4873]: I0219 11:32:03.869460 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22781893-e25a-43dc-b961-51629986957a-kube-api-access-sps5g" (OuterVolumeSpecName: "kube-api-access-sps5g") pod "22781893-e25a-43dc-b961-51629986957a" (UID: "22781893-e25a-43dc-b961-51629986957a"). InnerVolumeSpecName "kube-api-access-sps5g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 11:32:03 crc kubenswrapper[4873]: I0219 11:32:03.965326 4873 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22781893-e25a-43dc-b961-51629986957a-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 11:32:03 crc kubenswrapper[4873]: I0219 11:32:03.965385 4873 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sps5g\" (UniqueName: \"kubernetes.io/projected/22781893-e25a-43dc-b961-51629986957a-kube-api-access-sps5g\") on node \"crc\" DevicePath \"\""
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.339997 4873 generic.go:334] "Generic (PLEG): container finished" podID="22781893-e25a-43dc-b961-51629986957a" containerID="08f172af911f92e58eb66d1cf87d64ac45aba05754469b3bc18cf51d2d6c67bc" exitCode=0
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.340040 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2mgl" event={"ID":"22781893-e25a-43dc-b961-51629986957a","Type":"ContainerDied","Data":"08f172af911f92e58eb66d1cf87d64ac45aba05754469b3bc18cf51d2d6c67bc"}
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.340046 4873 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f2mgl"
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.340071 4873 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2mgl" event={"ID":"22781893-e25a-43dc-b961-51629986957a","Type":"ContainerDied","Data":"adfcfff4241ec7bc41746576adca7849a23b1df91e3ca1dd5f41fd3702cfc668"}
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.340091 4873 scope.go:117] "RemoveContainer" containerID="08f172af911f92e58eb66d1cf87d64ac45aba05754469b3bc18cf51d2d6c67bc"
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.359280 4873 scope.go:117] "RemoveContainer" containerID="21ef8368140c82be96b1a925b2aff609721a32bfa56b1096e8f028617eeee3f7"
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.382241 4873 scope.go:117] "RemoveContainer" containerID="b2eda7221fe7f44826c7503e51dd624ab541a1643b0ad04e32da7864d6f8dab3"
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.426049 4873 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22781893-e25a-43dc-b961-51629986957a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22781893-e25a-43dc-b961-51629986957a" (UID: "22781893-e25a-43dc-b961-51629986957a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.431903 4873 scope.go:117] "RemoveContainer" containerID="08f172af911f92e58eb66d1cf87d64ac45aba05754469b3bc18cf51d2d6c67bc"
Feb 19 11:32:04 crc kubenswrapper[4873]: E0219 11:32:04.432493 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08f172af911f92e58eb66d1cf87d64ac45aba05754469b3bc18cf51d2d6c67bc\": container with ID starting with 08f172af911f92e58eb66d1cf87d64ac45aba05754469b3bc18cf51d2d6c67bc not found: ID does not exist" containerID="08f172af911f92e58eb66d1cf87d64ac45aba05754469b3bc18cf51d2d6c67bc"
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.432542 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f172af911f92e58eb66d1cf87d64ac45aba05754469b3bc18cf51d2d6c67bc"} err="failed to get container status \"08f172af911f92e58eb66d1cf87d64ac45aba05754469b3bc18cf51d2d6c67bc\": rpc error: code = NotFound desc = could not find container \"08f172af911f92e58eb66d1cf87d64ac45aba05754469b3bc18cf51d2d6c67bc\": container with ID starting with 08f172af911f92e58eb66d1cf87d64ac45aba05754469b3bc18cf51d2d6c67bc not found: ID does not exist"
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.432576 4873 scope.go:117] "RemoveContainer" containerID="21ef8368140c82be96b1a925b2aff609721a32bfa56b1096e8f028617eeee3f7"
Feb 19 11:32:04 crc kubenswrapper[4873]: E0219 11:32:04.433040 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21ef8368140c82be96b1a925b2aff609721a32bfa56b1096e8f028617eeee3f7\": container with ID starting with 21ef8368140c82be96b1a925b2aff609721a32bfa56b1096e8f028617eeee3f7 not found: ID does not exist" containerID="21ef8368140c82be96b1a925b2aff609721a32bfa56b1096e8f028617eeee3f7"
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.433075 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21ef8368140c82be96b1a925b2aff609721a32bfa56b1096e8f028617eeee3f7"} err="failed to get container status \"21ef8368140c82be96b1a925b2aff609721a32bfa56b1096e8f028617eeee3f7\": rpc error: code = NotFound desc = could not find container \"21ef8368140c82be96b1a925b2aff609721a32bfa56b1096e8f028617eeee3f7\": container with ID starting with 21ef8368140c82be96b1a925b2aff609721a32bfa56b1096e8f028617eeee3f7 not found: ID does not exist"
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.433094 4873 scope.go:117] "RemoveContainer" containerID="b2eda7221fe7f44826c7503e51dd624ab541a1643b0ad04e32da7864d6f8dab3"
Feb 19 11:32:04 crc kubenswrapper[4873]: E0219 11:32:04.433516 4873 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2eda7221fe7f44826c7503e51dd624ab541a1643b0ad04e32da7864d6f8dab3\": container with ID starting with b2eda7221fe7f44826c7503e51dd624ab541a1643b0ad04e32da7864d6f8dab3 not found: ID does not exist" containerID="b2eda7221fe7f44826c7503e51dd624ab541a1643b0ad04e32da7864d6f8dab3"
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.433549 4873 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2eda7221fe7f44826c7503e51dd624ab541a1643b0ad04e32da7864d6f8dab3"} err="failed to get container status \"b2eda7221fe7f44826c7503e51dd624ab541a1643b0ad04e32da7864d6f8dab3\": rpc error: code = NotFound desc = could not find container \"b2eda7221fe7f44826c7503e51dd624ab541a1643b0ad04e32da7864d6f8dab3\": container with ID starting with b2eda7221fe7f44826c7503e51dd624ab541a1643b0ad04e32da7864d6f8dab3 not found: ID does not exist"
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.478664 4873 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22781893-e25a-43dc-b961-51629986957a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.674739 4873 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f2mgl"]
Feb 19 11:32:04 crc kubenswrapper[4873]: I0219 11:32:04.684665 4873 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f2mgl"]
Feb 19 11:32:05 crc kubenswrapper[4873]: I0219 11:32:05.498949 4873 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22781893-e25a-43dc-b961-51629986957a" path="/var/lib/kubelet/pods/22781893-e25a-43dc-b961-51629986957a/volumes"
Feb 19 11:32:18 crc kubenswrapper[4873]: I0219 11:32:18.484536 4873 scope.go:117] "RemoveContainer" containerID="c8e5f4205b3e7c0c1ae2128e3fdd0d50b895c85b98150f05f591e71e890b4f5b"
Feb 19 11:32:18 crc kubenswrapper[4873]: E0219 11:32:18.485298 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:32:33 crc kubenswrapper[4873]: I0219 11:32:33.484152 4873 scope.go:117] "RemoveContainer" containerID="c8e5f4205b3e7c0c1ae2128e3fdd0d50b895c85b98150f05f591e71e890b4f5b"
Feb 19 11:32:33 crc kubenswrapper[4873]: E0219 11:32:33.485223 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:32:46 crc kubenswrapper[4873]: I0219 11:32:46.484055 4873 scope.go:117] "RemoveContainer" containerID="c8e5f4205b3e7c0c1ae2128e3fdd0d50b895c85b98150f05f591e71e890b4f5b"
Feb 19 11:32:46 crc kubenswrapper[4873]: E0219 11:32:46.484804 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:32:59 crc kubenswrapper[4873]: I0219 11:32:59.485188 4873 scope.go:117] "RemoveContainer" containerID="c8e5f4205b3e7c0c1ae2128e3fdd0d50b895c85b98150f05f591e71e890b4f5b"
Feb 19 11:32:59 crc kubenswrapper[4873]: E0219 11:32:59.486503 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"
Feb 19 11:33:13 crc kubenswrapper[4873]: I0219 11:33:13.485158 4873 scope.go:117] "RemoveContainer" containerID="c8e5f4205b3e7c0c1ae2128e3fdd0d50b895c85b98150f05f591e71e890b4f5b"
Feb 19 11:33:13 crc kubenswrapper[4873]: E0219 11:33:13.486277 4873 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qmsl7_openshift-machine-config-operator(8c61760e-2955-4688-b68b-1ceeda73f356)\"" pod="openshift-machine-config-operator/machine-config-daemon-qmsl7" podUID="8c61760e-2955-4688-b68b-1ceeda73f356"